Pneumonia is an infection in one or both lungs, caused by bacteria, viruses, or fungi. The infection inflames the air sacs in the lungs, which are called alveoli.
The alveoli fill with fluid or pus, making it difficult to breathe. Pneumonia can range from mild to severe enough to require hospitalization.

Pneumonia accounts for over 15% of all deaths of children under 5 years old worldwide. In 2017, 920,000 children under the age of 5 died from the disease. Diagnosing it requires review of a chest radiograph (CXR) by highly trained specialists, along with confirmation through clinical history, vital signs, and laboratory exams. Pneumonia usually manifests as one or more areas of increased opacity on a CXR. However, diagnosing pneumonia on a CXR is complicated by a number of other lung conditions such as fluid overload (pulmonary edema), bleeding, volume loss (atelectasis or collapse), lung cancer, and post-radiation or surgical changes. Outside of the lungs, fluid in the pleural space (pleural effusion) also appears as increased opacity on a CXR. When available, comparing CXRs of the patient taken at different time points, and correlating them with clinical symptoms and history, is helpful in making the diagnosis.
CXRs are the most commonly performed diagnostic imaging study. A number of factors such as positioning of the patient and depth of inspiration can alter the appearance of the CXR, complicating interpretation further. In addition, clinicians are faced with reading high volumes of images every shift.
Detecting pneumonia on a radiograph amounts to detecting inflammation of the lungs. In this project, you're challenged to build an algorithm that detects a visual signal for pneumonia in medical images. Specifically, your algorithm needs to automatically locate lung opacities on chest radiographs.
Sometimes pneumonia can be difficult to diagnose because the symptoms are so variable, and are often very similar to those seen in a cold or influenza. To diagnose pneumonia, and to try to identify the germ that is causing the illness, your doctor will ask questions about your medical history, do a physical exam, and run some tests.
Your doctor will ask you questions about your signs and symptoms, and how and when they began. To help figure out whether your infection is caused by bacteria, viruses, or fungi, you may also be asked about possible exposures.
Your doctor will listen to your lungs with a stethoscope. If you have pneumonia, your lungs may make crackling, bubbling, and rumbling sounds when you inhale.
If your doctor suspects you may have pneumonia, they will probably recommend some tests to confirm the diagnosis and learn more about your infection. These may include:
1) Blood tests to confirm the infection and to try to identify the germ that is causing your illness.
2) Chest X-ray to look for the location and extent of inflammation in your lungs.
3) Pulse oximetry to measure the oxygen level in your blood. Pneumonia can prevent your lungs from moving enough oxygen into your bloodstream.
4) Sputum test on a sample of mucus (sputum) taken after a deep cough, to look for the source of the infection.
If you are considered a high-risk patient because of your age and overall health, or if you are hospitalized, the doctors may want to do some additional tests, including:
5) CT scan of the chest to get a better view of the lungs and look for abscesses or other complications.
6) Arterial blood gas test, to measure the amount of oxygen in a blood sample taken from an artery, usually in your wrist. This is more accurate than the simpler pulse oximetry.
7) Pleural fluid culture, which removes a small amount of fluid from around tissues that surround the lung, to analyze and identify bacteria causing the pneumonia.
8) Bronchoscopy, a procedure used to look into the lungs' airways. If you are hospitalized and your treatment is not working well, doctors may want to see whether something else is affecting your airways, such as a blockage. They may also take fluid samples or a biopsy of lung tissue.
The goal is to automate pneumonia screening in chest radiographs, providing affected-area details through bounding boxes.
Such a system could assist physicians in making better clinical decisions, or even replace human judgment in certain functional areas of healthcare (e.g., radiology).
Guided by relevant clinical questions, powerful AI techniques can unlock clinically relevant information hidden in massive amounts of data, which in turn can assist clinical decision making.
Medical images are stored in a special format called DICOM files (*.dcm). They contain a combination of header metadata as well as underlying raw image arrays for pixel data.
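As a quick illustration of the DICOM structure described above, the sketch below pulls the header fields this project uses later (`PatientAge`, `PatientSex`, `ViewPosition`, `StudyID`) into a plain dict. The helper works on any object exposing those attributes, so the demo uses a stand-in object; reading a real file with `pydicom.dcmread` is shown in the comments.

```python
from types import SimpleNamespace

def extract_dicom_metadata(ds):
    """Collect the header fields this project uses from a pydicom dataset
    (or any object exposing the same attributes)."""
    return {
        "PatientAge": ds.PatientAge,
        "PatientSex": ds.PatientSex,
        "ViewPosition": ds.ViewPosition,
        "StudyID": ds.StudyID,
    }

# With pydicom installed, a real file would be read like:
#   import pydicom
#   ds = pydicom.dcmread("stage_2_train_images/<patientId>.dcm")
#   meta = extract_dicom_metadata(ds)
#   pixels = ds.pixel_array   # the underlying raw image array

# Stand-in dataset to show the shape of the result:
fake = SimpleNamespace(PatientAge="51", PatientSex="F", ViewPosition="PA", StudyID="")
print(extract_dicom_metadata(fake))
```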
Dataset link: https://www.kaggle.com/c/rsna-pneumonia-detection-challenge/data
In this project, we have to predict whether pneumonia exists in a given image. This is done by predicting bounding boxes around areas of the lung. Samples without bounding boxes are negative and contain no definitive evidence of pneumonia. Samples with bounding boxes indicate evidence of pneumonia.
When making predictions, the model should predict as many bounding boxes as necessary, in the format: confidence x-min y-min width height
There will be only ONE predicted row per image. This row may include multiple bounding boxes.
A properly formatted row may look like any of the following.
For patientIds with no predicted pneumonia / bounding boxes: 0004cfab-14fd-4e49-80ba-63a80b6bddd6,
For patientIds with a single predicted bounding box: 0004cfab-14fd-4e49-80ba-63a80b6bddd6,0.5 0 0 100 100
For patientIds with multiple predicted bounding boxes: 0004cfab-14fd-4e49-80ba-63a80b6bddd6,0.5 0 0 100 100 0.5 0 0 100 100, etc.
The general format is as follows:
patientId,{confidence x-min y-min width height},{confidence x-min y-min width height}, etc.
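The row format above can be generated with a small helper. This is a sketch (the function name is my own); it follows the examples given: an empty box list yields `patientId,` and multiple boxes are space-separated after the comma.

```python
def format_prediction_row(patient_id, boxes):
    """Build one submission row: 'patientId,conf x-min y-min width height ...'.
    boxes is a list of (confidence, x, y, width, height) tuples;
    an empty list means no predicted pneumonia for this patient."""
    box_str = " ".join(f"{conf} {x} {y} {w} {h}" for conf, x, y, w, h in boxes)
    return f"{patient_id},{box_str}"

print(format_prediction_row("0004cfab-14fd-4e49-80ba-63a80b6bddd6", []))
# -> 0004cfab-14fd-4e49-80ba-63a80b6bddd6,
print(format_prediction_row("0004cfab-14fd-4e49-80ba-63a80b6bddd6",
                            [(0.5, 0, 0, 100, 100), (0.5, 0, 0, 100, 100)]))
# -> 0004cfab-14fd-4e49-80ba-63a80b6bddd6,0.5 0 0 100 100 0.5 0 0 100 100
```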
stage_2_train_labels.csv - the training set. Contains patientIds and bounding box / target information.
stage_2_detailed_class_info.csv - provides detailed information about the type of positive or negative class for each image.
patientId - Each patientId corresponds to a unique image.
x - the upper-left x coordinate of the bounding box.
y - the upper-left y coordinate of the bounding box.
width - the width of the bounding box.
height - the height of the bounding box.
Target - the binary target, indicating whether this sample has evidence of pneumonia.
Tissues with sparse material, such as the lungs, which are full of air, do not absorb X-rays and appear black in the image. Dense tissues such as bones absorb X-rays and appear white in the image.
While we are nominally detecting "lung opacities", there are lung opacities that are not pneumonia related.
In the data, some of these are labeled "No Lung Opacity / Not Normal". This extra third class indicates that while pneumonia was determined not to be present, there was nonetheless some type of abnormality on the image, and this finding may often mimic the appearance of true pneumonia.
It's important to note what the various shades of gray in a chest X-ray represent:
In a normal image, we see the lungs as black, but with several structures projected over them: mainly the rib cage, the main airways, the blood vessels, and the heart.
In case of pneumonia, a haziness (also referred to as consolidation) is present in the chest x-ray image.
Images with no pneumonia but labeled Not Normal can show rounded hazy regions or masses (for example, lung nodules or masses that may be due to cancer).
There are other exceptional cases where an opacity is present without pneumonia, including pneumonectomy (surgical removal of a lung), an enlarged heart, pleural effusion, etc.
Reference: https://www.kaggle.com/zahaviguy/what-are-lung-opacities
#Mounting Google CoLab
from google.colab import drive
drive.mount('/content/drive/',force_remount=True)
Mounted at /content/drive/
!pip install pydicom
!pip install mrcnn
import os
import random
import pickle
import cv2
import numpy as np
import pandas as pd
import pylab
import matplotlib.pyplot as plt
import matplotlib.patches as patches
from matplotlib.patches import Rectangle
import seaborn as sns
from glob import glob
from tqdm import tqdm
from collections import defaultdict
import pydicom as pyd
import skimage
from skimage.transform import resize
from skimage import feature, filters
from sklearn.preprocessing import OneHotEncoder
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, classification_report, roc_curve, auc, precision_recall_curve
from keras.models import Model, load_model
from keras.layers import Dense, Input, Conv2D, MaxPool2D, Flatten
from keras.preprocessing.image import ImageDataGenerator
import tensorflow as tf
from tensorflow.keras.applications.mobilenet import MobileNet
from tensorflow.keras.layers import Concatenate, UpSampling2D, Conv2D, Reshape, BatchNormalization
from tensorflow.keras.models import Model
from tensorflow.keras.losses import binary_crossentropy
from tensorflow.keras.optimizers import Adam
%matplotlib inline
Successfully installed pydicom-2.2.2
Successfully installed mrcnn-0.2
#Unzip the "rsna-pneumonia-detection-challenge.zip" file
from zipfile import ZipFile
with ZipFile('/content/drive/MyDrive/Colab Notebooks/Capstone Project/rsna-pneumonia-detection-challenge.zip', 'r') as z:
z.extractall()
!ls '/content/drive/MyDrive/Colab Notebooks/Capstone Project'
'Capstone Project Pneumonia Detection Ronald.ipynb' cnn-segmentation-connected-components.ipynb custom-training-object-detection-models.ipynb 'GCP Credits Request Link - RSNA.txt' Pneumonia_Detection.ipynb 'Pneumonia Detection Milestone 1.ipynb' 'Pneumonia Detection Milestone 2.ipynb' 'Pneumonia Detection with PPTs' pneumonia_status stage_2_detailed_class_info.csv stage_2_sample_submission.csv stage_2_test_images stage_2_train_images stage_2_train_labels.csv 'YOLO v5.ipynb' 'YOLO v5 Pneumonia Detection.ipynb'
#Setting the project path
project_path = '/content/drive/MyDrive/Colab Notebooks/Capstone Project/'
os.chdir(project_path)
#Checking file format (.dcm) in the train images folder
for file in os.listdir((os.path.join(project_path,'stage_2_train_images'))):
if not file.endswith('.dcm'):
print(file)
#Checking file format (.dcm) in the test images folder
for file in os.listdir((os.path.join(project_path,'stage_2_test_images'))):
if not file.endswith('.dcm'):
print(file)
class_info = pd.read_csv('stage_2_detailed_class_info.csv')
labels = pd.read_csv('stage_2_train_labels.csv')
print('Size of Dataset 1: ',labels.shape)
print('Size of Dataset 2: ',class_info.shape)
print('Number of Unique X-Rays in Dataset 1 : ',labels['patientId'].nunique())
print('Number of Unique X-Rays in Dataset 2 : ',class_info['patientId'].nunique())
Size of Dataset 1:  (30227, 6)
Size of Dataset 2:  (30227, 2)
Number of Unique X-Rays in Dataset 1 :  26684
Number of Unique X-Rays in Dataset 2 :  26684
class_info.head()
| patientId | class | |
|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | No Lung Opacity / Not Normal |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | No Lung Opacity / Not Normal |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | No Lung Opacity / Not Normal |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | Normal |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity |
labels.head()
| patientId | x | y | width | height | Target | |
|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | NaN | NaN | NaN | NaN | 0 |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | NaN | NaN | NaN | NaN | 0 |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | NaN | NaN | NaN | NaN | 0 |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | NaN | NaN | NaN | NaN | 0 |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.0 | 152.0 | 213.0 | 379.0 | 1 |
labels.drop_duplicates(inplace=True)
class_info.drop_duplicates(inplace=True)
print('Size of Dataset 1: ',labels.shape)
print('Size of Dataset 2: ',class_info.shape)
print('Number of Unique X-Rays in Dataset 1 : ',labels['patientId'].nunique())
print('Number of Unique X-Rays in Dataset 2 : ',class_info['patientId'].nunique())
Size of Dataset 1:  (30227, 6)
Size of Dataset 2:  (26684, 2)
Number of Unique X-Rays in Dataset 1 :  26684
Number of Unique X-Rays in Dataset 2 :  26684
class_info.head()
| patientId | class | |
|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | No Lung Opacity / Not Normal |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | No Lung Opacity / Not Normal |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | No Lung Opacity / Not Normal |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | Normal |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity |
labels.head()
| patientId | x | y | width | height | Target | |
|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | NaN | NaN | NaN | NaN | 0 |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | NaN | NaN | NaN | NaN | 0 |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | NaN | NaN | NaN | NaN | 0 |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | NaN | NaN | NaN | NaN | 0 |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.0 | 152.0 | 213.0 | 379.0 | 1 |
Compare the labels and class information datasets for a possible join
print("Shape of the train labels:", labels.shape)
print("Shape of the detailed class information:", class_info.shape)
Shape of the train labels: (30227, 6)
Shape of the detailed class information: (26684, 2)
Data Inference: A join or merge on 'patientId' should give us a dataset with 30227 rows and 7 columns, assuming we keep all label rows and add the single 'class' column from the class information.
Check uniqueness of the data
Approach:
There could be duplicate patientId entries, resulting in multiple bounding boxes with their respective target classification/class information. Check whether the records appear in the same order in the "train labels" and "class information" datasets; if they do, a simple join could be performed on the index.
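To see why a left merge on 'patientId' preserves the duplicate rows (one per bounding box), here is a toy example with the same structure as the two datasets; the IDs and values are made up for illustration:

```python
import pandas as pd

# Toy stand-ins: class_info has one row per patient, while labels can
# have several rows per patient (one per bounding box).
class_info_toy = pd.DataFrame({
    "patientId": ["a", "b"],
    "class": ["Lung Opacity", "Normal"],
})
labels_toy = pd.DataFrame({
    "patientId": ["a", "a", "b"],
    "Target": [1, 1, 0],
})

# Left merge on patientId: patient "a" appears twice in the result,
# once per bounding box, just as in the real combined dataset.
merged = pd.merge(class_info_toy, labels_toy, how="left", on="patientId")
print(merged)
```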
Exploring Train Labels Dataset
labels_count = pd.DataFrame()
labels_count = labels['patientId'].value_counts().value_counts().reset_index()
labels_count.columns = ['Counts', 'records in train labels']
labels_count.style.hide_index()
| Counts | records in train labels |
|---|---|
| 1 | 23286 |
| 2 | 3266 |
| 3 | 119 |
| 4 | 13 |
unique_labels = labels_count['records in train labels'].sum()
print("Total unique records in train labels = ", unique_labels)
Total unique records in train labels = 26684
Inference: The train labels contain duplicate patientId entries (3,266 patients have 2 records, 119 have 3, and 13 have 4), corresponding to multiple bounding boxes per image. The total of 26,684 unique patientIds matches the number of unique X-rays.
Exploring Class Information Dataset
#check this against results
class_data = pd.DataFrame()
class_data = class_info['patientId'].value_counts().value_counts().reset_index()
class_data.columns = ['counts', 'records in class info']
class_data.style.hide_index()
| counts | records in class info |
|---|---|
| 1 | 26684 |
unique_classes = class_data['records in class info'].sum()
print("Total unique records in class info = ", unique_classes)
Total unique records in class info = 26684
Merging Data
combined = pd.merge(left = class_info, right = labels, how = 'left', on = 'patientId')
combined.info(show_counts = True)
combined.head(20)
&lt;class 'pandas.core.frame.DataFrame'&gt;
Int64Index: 30227 entries, 0 to 30226
Data columns (total 7 columns):
 #   Column     Non-Null Count  Dtype
---  ------     --------------  -----
 0   patientId  30227 non-null  object
 1   class      30227 non-null  object
 2   x          9555 non-null   float64
 3   y          9555 non-null   float64
 4   width      9555 non-null   float64
 5   height     9555 non-null   float64
 6   Target     30227 non-null  int64
dtypes: float64(4), int64(1), object(2)
memory usage: 1.8+ MB
| patientId | class | x | y | width | height | Target | |
|---|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | Normal | NaN | NaN | NaN | NaN | 0 |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity | 264.0 | 152.0 | 213.0 | 379.0 | 1 |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity | 562.0 | 152.0 | 256.0 | 453.0 | 1 |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity | 323.0 | 577.0 | 160.0 | 104.0 | 1 |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity | 695.0 | 575.0 | 162.0 | 137.0 | 1 |
| 10 | 008c19e8-a820-403a-930a-bc74a4053664 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 11 | 009482dc-3db5-48d4-8580-5c89c4f01334 | Normal | NaN | NaN | NaN | NaN | 0 |
| 12 | 009eb222-eabc-4150-8121-d5a6d06b8ebf | Normal | NaN | NaN | NaN | NaN | 0 |
| 13 | 00a85be6-6eb0-421d-8acf-ff2dc0007e8a | Normal | NaN | NaN | NaN | NaN | 0 |
| 14 | 00aecb01-a116-45a2-956c-08d2fa55433f | Lung Opacity | 288.0 | 322.0 | 94.0 | 135.0 | 1 |
| 15 | 00aecb01-a116-45a2-956c-08d2fa55433f | Lung Opacity | 547.0 | 299.0 | 119.0 | 165.0 | 1 |
| 16 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | Lung Opacity | 306.0 | 544.0 | 168.0 | 244.0 | 1 |
| 17 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | Lung Opacity | 650.0 | 511.0 | 206.0 | 284.0 | 1 |
| 18 | 00d7c36e-3cdf-4df6-ac03-6c30cdc8e85b | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 |
| 19 | 00f08de1-517e-4652-a04f-d1dc9ee48593 | Lung Opacity | 181.0 | 184.0 | 206.0 | 506.0 | 1 |
Shape of Dataset
combined.shape
(30227, 7)
combined['patientId'].nunique()
26684
combined['patientId'].value_counts()
3239951b-6211-4290-b237-3d9ad17176db 4
31764d54-ea3b-434f-bae2-8c579ed13799 4
1c44e0a4-4612-438f-9a83-8d5bf919cb67 4
8dc8e54b-5b05-4dac-80b9-fa48878621e2 4
349f10b4-dc3e-4f3f-b2e4-a5b81448ce87 4
..
7233ab9f-7fa6-4b5b-acdc-1c93bf945db5 1
722cca4a-f542-47ba-be14-29238c1124e1 1
722ad9c3-919a-4c08-bb67-cafd603ba754 1
7228bf11-3ce7-47f5-bee3-d36325df1dc8 1
943b27f1-ba1c-415b-b738-7b52aefa445b 1
Name: patientId, Length: 26684, dtype: int64
labels_info=combined['class'].value_counts()
explode = (0.01,0.01,0.01)
fig1, ax1 = plt.subplots(figsize=(5,5))
ax1.pie(labels_info.values, explode=explode, labels=labels_info.index, autopct='%1.1f%%',
shadow=True, startangle=90)
ax1.axis('equal')
plt.title('Class Distribution')
plt.show()
print('Total records with Lung Opacity: ', combined[combined['class'] == 'Lung Opacity']['class'].count())
print('Total records with No Lung Opacity / Not Normal: ', combined[combined['class'] == 'No Lung Opacity / Not Normal']['class'].count())
print('Total Normal records: ', combined[combined['class'] == 'Normal']['class'].count())
Total records with Lung Opacity:  9555
Total records with No Lung Opacity / Not Normal:  11821
Total Normal records:  8851
Observation:
The graph above shows the total number of records for each class. Patients with No Lung Opacity / Not Normal form the largest group, followed by those with Lung Opacity and then Normal patients.
8,851 (29.3%) records do not have any disease.
9,555 (31.6%) records have Lung Opacity.
11,821 (39.1%) records have No Lung Opacity / Not Normal.
#Checking the Target Distribution
labels_info=combined['Target'].value_counts()
explode = (0.1,0.0)
fig1, ax1 = plt.subplots(figsize=(5,5))
ax1.pie(labels_info.values, explode=explode, labels=['Normal','Pneumonia'], autopct='%1.1f%%',
shadow=True, startangle=90)
ax1.axis('equal')
plt.title('Target Distribution')
plt.show()
Observation:
From the above graph, we can infer that 31.6% of records in the dataset are positive for pneumonia and 68.4% are not.
#Appending DICOM header data to the records
age, gender, view_posn, study_id = [], [], [], []
combined['patientAge'] = 0
combined['patientSex'] = ''
combined['ViewPosition'] = ''
combined['StudyID'] = ''
# Read each patient's DICOM file and collect the header fields we need.
# tqdm shows a progress bar instead of printing a counter for every row.
for value in tqdm(combined['patientId']):
    dcm_patientFile = os.path.join(project_path, 'stage_2_train_images', '%s.dcm' % value)
    dcm_patientData = pyd.dcmread(dcm_patientFile)
    age.append(dcm_patientData.PatientAge)
    gender.append(dcm_patientData.PatientSex)
    view_posn.append(dcm_patientData.ViewPosition)
    study_id.append(dcm_patientData.StudyID)
combined['patientAge'] = age
combined['patientSex'] = gender
combined['ViewPosition'] = view_posn
combined['StudyID'] = study_id
combined.head(10)
27306
27307
27308
27309
27310
27311
27312
27313
27314
27315
27316
27317
27318
27319
27320
27321
27322
27323
27324
27325
27326
27327
27328
27329
27330
27331
27332
27333
27334
27335
27336
27337
27338
27339
27340
27341
27342
27343
27344
27345
27346
27347
27348
27349
27350
27351
27352
27353
27354
27355
27356
27357
27358
27359
27360
27361
27362
27363
27364
27365
27366
27367
27368
27369
27370
27371
27372
27373
27374
27375
27376
27377
27378
27379
27380
27381
27382
27383
27384
27385
27386
27387
27388
27389
27390
27391
27392
27393
27394
27395
27396
27397
27398
27399
27400
27401
27402
27403
27404
27405
27406
27407
27408
27409
27410
27411
27412
27413
27414
27415
27416
27417
27418
27419
27420
27421
27422
27423
27424
27425
27426
27427
27428
27429
27430
27431
27432
27433
27434
27435
27436
27437
27438
27439
27440
27441
27442
27443
27444
27445
27446
27447
27448
27449
27450
27451
27452
27453
27454
27455
27456
27457
27458
27459
27460
27461
27462
27463
27464
27465
27466
27467
27468
27469
27470
27471
27472
27473
27474
27475
27476
27477
27478
27479
27480
27481
27482
27483
27484
27485
27486
27487
27488
27489
27490
27491
27492
27493
27494
27495
27496
27497
27498
27499
27500
27501
27502
27503
27504
27505
27506
27507
27508
27509
27510
27511
27512
27513
27514
27515
27516
27517
27518
27519
27520
27521
27522
27523
27524
27525
27526
27527
27528
27529
27530
27531
27532
27533
27534
27535
27536
27537
27538
27539
27540
27541
27542
27543
27544
27545
27546
27547
27548
27549
27550
27551
27552
27553
27554
27555
27556
27557
27558
27559
27560
27561
27562
27563
27564
27565
27566
27567
27568
27569
27570
27571
27572
27573
27574
27575
27576
27577
27578
27579
27580
27581
27582
27583
27584
27585
27586
27587
27588
27589
27590
27591
27592
27593
27594
27595
27596
27597
27598
27599
27600
27601
27602
27603
27604
27605
27606
27607
27608
27609
27610
27611
27612
27613
27614
27615
27616
27617
27618
27619
27620
27621
27622
27623
27624
27625
27626
27627
27628
27629
27630
27631
27632
27633
27634
27635
27636
27637
27638
27639
27640
27641
27642
27643
27644
27645
27646
27647
27648
27649
27650
27651
27652
27653
27654
27655
27656
27657
27658
27659
27660
27661
27662
27663
27664
27665
27666
27667
27668
27669
27670
27671
27672
27673
27674
27675
27676
27677
27678
27679
27680
27681
27682
27683
27684
27685
27686
27687
27688
27689
27690
27691
27692
27693
27694
27695
27696
27697
27698
27699
27700
27701
27702
27703
27704
27705
27706
27707
27708
27709
27710
27711
27712
27713
27714
27715
27716
27717
27718
27719
27720
27721
27722
27723
27724
27725
27726
27727
27728
27729
27730
27731
27732
27733
27734
27735
27736
27737
27738
27739
27740
27741
27742
27743
27744
27745
27746
27747
27748
27749
27750
27751
27752
27753
27754
27755
27756
27757
27758
27759
27760
27761
27762
27763
27764
27765
27766
27767
27768
27769
27770
27771
27772
27773
27774
27775
27776
27777
27778
27779
27780
27781
27782
27783
27784
27785
27786
27787
27788
27789
27790
27791
27792
27793
27794
27795
27796
27797
27798
27799
27800
27801
27802
27803
27804
27805
27806
27807
27808
27809
27810
27811
27812
27813
27814
27815
27816
27817
27818
27819
27820
27821
27822
27823
27824
27825
27826
27827
27828
27829
27830
27831
27832
27833
27834
27835
27836
27837
27838
27839
27840
27841
27842
27843
27844
27845
27846
27847
27848
27849
27850
27851
27852
27853
27854
27855
27856
27857
27858
27859
27860
27861
27862
27863
27864
27865
27866
27867
27868
27869
27870
27871
27872
27873
27874
27875
27876
27877
27878
27879
27880
27881
27882
27883
27884
27885
27886
27887
27888
27889
27890
27891
27892
27893
27894
27895
27896
27897
27898
27899
27900
27901
27902
27903
27904
27905
27906
27907
27908
27909
27910
27911
27912
27913
27914
27915
27916
27917
27918
27919
27920
27921
27922
27923
27924
27925
27926
27927
27928
27929
27930
27931
27932
27933
27934
27935
27936
27937
27938
27939
27940
27941
27942
27943
27944
27945
27946
27947
27948
27949
27950
27951
27952
27953
27954
27955
27956
27957
27958
27959
27960
27961
27962
27963
27964
27965
27966
27967
27968
27969
27970
27971
27972
27973
27974
27975
27976
27977
27978
27979
27980
27981
27982
27983
27984
27985
27986
27987
27988
27989
27990
27991
27992
27993
27994
27995
27996
27997
27998
27999
28000
28001
28002
28003
28004
28005
28006
28007
28008
28009
28010
28011
28012
28013
28014
28015
28016
28017
28018
28019
28020
28021
28022
28023
28024
28025
28026
28027
28028
28029
28030
28031
28032
28033
28034
28035
28036
28037
28038
28039
28040
28041
28042
28043
28044
28045
28046
28047
28048
28049
28050
28051
28052
28053
28054
28055
28056
28057
28058
28059
28060
28061
28062
28063
28064
28065
28066
28067
28068
28069
28070
28071
28072
28073
28074
28075
28076
28077
28078
28079
28080
28081
28082
28083
28084
28085
28086
28087
28088
28089
28090
28091
28092
28093
28094
28095
28096
28097
28098
28099
28100
28101
28102
28103
28104
28105
28106
28107
28108
28109
28110
28111
28112
28113
28114
28115
28116
28117
28118
28119
28120
28121
28122
28123
28124
28125
28126
28127
28128
28129
28130
28131
28132
28133
28134
28135
28136
28137
28138
28139
28140
28141
28142
28143
28144
28145
28146
28147
28148
28149
28150
28151
28152
28153
28154
28155
28156
28157
28158
28159
28160
28161
28162
28163
28164
28165
28166
28167
28168
28169
28170
28171
28172
28173
28174
28175
28176
28177
28178
28179
28180
28181
28182
28183
28184
28185
28186
28187
28188
28189
28190
28191
28192
28193
28194
28195
28196
28197
28198
28199
28200
28201
28202
28203
28204
28205
28206
28207
28208
28209
28210
28211
28212
28213
28214
28215
28216
28217
28218
28219
28220
28221
28222
28223
28224
28225
28226
28227
28228
28229
28230
28231
28232
28233
28234
28235
28236
28237
28238
28239
28240
28241
28242
28243
28244
28245
28246
28247
28248
28249
28250
28251
28252
28253
28254
28255
28256
28257
28258
28259
28260
28261
28262
28263
28264
28265
28266
28267
28268
28269
28270
28271
28272
28273
28274
28275
28276
28277
28278
28279
28280
28281
28282
28283
28284
28285
28286
28287
28288
28289
28290
28291
28292
28293
28294
28295
28296
28297
28298
28299
28300
28301
28302
28303
28304
28305
28306
28307
28308
28309
28310
28311
28312
28313
28314
28315
28316
28317
28318
28319
28320
28321
28322
28323
28324
28325
28326
28327
28328
28329
28330
28331
28332
28333
28334
28335
28336
28337
28338
28339
28340
28341
28342
28343
28344
28345
28346
28347
28348
28349
28350
28351
28352
28353
28354
28355
28356
28357
28358
28359
28360
28361
28362
28363
28364
28365
28366
28367
28368
28369
28370
28371
28372
28373
28374
28375
28376
28377
28378
28379
28380
28381
28382
28383
28384
28385
28386
28387
28388
28389
28390
28391
28392
28393
28394
28395
28396
28397
28398
28399
28400
28401
28402
28403
28404
28405
28406
28407
28408
28409
28410
28411
28412
28413
28414
28415
28416
28417
28418
28419
28420
28421
28422
28423
28424
28425
28426
28427
28428
28429
28430
28431
28432
28433
28434
28435
28436
28437
28438
28439
28440
28441
28442
28443
28444
28445
28446
28447
28448
28449
28450
28451
28452
28453
28454
28455
28456
28457
28458
28459
28460
28461
28462
28463
28464
28465
28466
28467
28468
28469
28470
28471
28472
28473
28474
28475
28476
28477
28478
28479
28480
28481
28482
28483
28484
28485
28486
28487
28488
28489
28490
28491
28492
28493
28494
28495
28496
28497
28498
28499
28500
28501
28502
28503
28504
28505
28506
28507
28508
28509
28510
28511
28512
28513
28514
28515
28516
28517
28518
28519
28520
28521
28522
28523
28524
28525
28526
28527
28528
28529
28530
28531
28532
28533
28534
28535
28536
28537
28538
28539
28540
28541
28542
28543
28544
28545
28546
28547
28548
28549
28550
28551
28552
28553
28554
28555
28556
28557
28558
28559
28560
28561
28562
28563
28564
28565
28566
28567
28568
28569
28570
28571
28572
28573
28574
28575
28576
28577
28578
28579
28580
28581
28582
28583
28584
28585
28586
28587
28588
28589
28590
28591
28592
28593
28594
28595
28596
28597
28598
28599
28600
28601
28602
28603
28604
28605
28606
28607
28608
28609
28610
28611
28612
28613
28614
28615
28616
28617
28618
28619
28620
28621
28622
28623
28624
28625
28626
28627
28628
28629
28630
28631
28632
28633
28634
28635
28636
28637
28638
28639
28640
28641
28642
28643
28644
28645
28646
28647
28648
28649
28650
28651
28652
28653
28654
28655
28656
28657
28658
28659
28660
28661
28662
28663
28664
28665
28666
28667
28668
28669
28670
28671
28672
28673
28674
28675
28676
28677
28678
28679
28680
28681
28682
28683
28684
28685
28686
28687
28688
28689
28690
28691
28692
28693
28694
28695
28696
28697
28698
28699
28700
28701
28702
28703
28704
28705
28706
28707
28708
28709
28710
28711
28712
28713
28714
28715
28716
28717
28718
28719
28720
28721
28722
28723
28724
28725
28726
28727
28728
28729
28730
28731
28732
28733
28734
28735
28736
28737
28738
28739
28740
28741
28742
28743
28744
28745
28746
28747
28748
28749
28750
28751
28752
28753
28754
28755
28756
28757
28758
28759
28760
28761
28762
28763
28764
28765
28766
28767
28768
28769
28770
28771
28772
28773
28774
28775
28776
28777
28778
28779
28780
28781
28782
28783
28784
28785
28786
28787
28788
28789
28790
28791
28792
28793
28794
28795
28796
28797
28798
28799
28800
28801
28802
28803
28804
28805
28806
28807
28808
28809
28810
28811
28812
28813
28814
28815
28816
28817
28818
28819
28820
28821
28822
28823
28824
28825
28826
28827
28828
28829
28830
28831
28832
28833
28834
28835
28836
28837
28838
28839
28840
28841
28842
28843
28844
28845
28846
28847
28848
28849
28850
28851
28852
28853
28854
28855
28856
28857
28858
28859
28860
28861
28862
28863
28864
28865
28866
28867
28868
28869
28870
28871
28872
28873
28874
28875
28876
28877
28878
28879
28880
28881
28882
28883
28884
28885
28886
28887
28888
28889
28890
28891
28892
28893
28894
28895
28896
28897
28898
28899
28900
28901
28902
28903
28904
28905
28906
28907
28908
28909
28910
28911
28912
28913
28914
28915
28916
28917
28918
28919
28920
28921
28922
28923
28924
28925
28926
28927
28928
28929
28930
28931
28932
28933
28934
28935
28936
28937
28938
28939
28940
28941
28942
28943
28944
28945
28946
28947
28948
28949
28950
28951
28952
28953
28954
28955
28956
28957
28958
28959
28960
28961
28962
28963
28964
28965
28966
28967
28968
28969
28970
28971
28972
28973
28974
28975
28976
28977
28978
28979
28980
28981
28982
28983
28984
28985
28986
28987
28988
28989
28990
28991
28992
28993
28994
28995
28996
28997
28998
28999
29000
29001
29002
29003
29004
29005
29006
29007
29008
29009
29010
29011
29012
29013
29014
29015
29016
29017
29018
29019
29020
29021
29022
29023
29024
29025
29026
29027
29028
29029
29030
29031
29032
29033
29034
29035
29036
29037
29038
29039
29040
29041
29042
29043
29044
29045
29046
29047
29048
29049
29050
29051
29052
29053
29054
29055
29056
29057
29058
29059
29060
29061
29062
29063
29064
29065
29066
29067
29068
29069
29070
29071
29072
29073
29074
29075
29076
29077
29078
29079
29080
29081
29082
29083
29084
29085
29086
29087
29088
29089
29090
29091
29092
29093
29094
29095
29096
29097
29098
29099
29100
29101
29102
29103
29104
29105
29106
29107
29108
29109
29110
29111
29112
29113
29114
29115
29116
29117
29118
29119
29120
29121
29122
29123
29124
29125
29126
29127
29128
29129
29130
29131
29132
29133
29134
29135
29136
29137
29138
29139
29140
29141
29142
29143
29144
29145
29146
29147
29148
29149
29150
29151
29152
29153
29154
29155
29156
29157
29158
29159
29160
29161
29162
29163
29164
29165
29166
29167
29168
29169
29170
29171
29172
29173
29174
29175
29176
29177
29178
29179
29180
29181
29182
29183
29184
29185
29186
29187
29188
29189
29190
29191
29192
29193
29194
29195
29196
29197
29198
29199
29200
29201
29202
29203
29204
29205
29206
29207
29208
29209
29210
29211
29212
29213
29214
29215
29216
29217
29218
29219
29220
29221
29222
29223
29224
29225
29226
29227
29228
29229
29230
29231
29232
29233
29234
29235
29236
29237
29238
29239
29240
29241
29242
29243
29244
29245
29246
29247
29248
29249
29250
29251
29252
29253
29254
29255
29256
29257
29258
29259
29260
29261
29262
29263
29264
29265
29266
29267
29268
29269
29270
29271
29272
29273
29274
29275
29276
29277
29278
29279
29280
29281
29282
29283
29284
29285
29286
29287
29288
29289
29290
29291
29292
29293
29294
29295
29296
29297
29298
29299
29300
29301
29302
29303
29304
29305
29306
29307
29308
29309
29310
29311
29312
29313
29314
29315
29316
29317
29318
29319
29320
29321
29322
29323
29324
29325
29326
29327
29328
29329
29330
29331
29332
29333
29334
29335
29336
29337
29338
29339
29340
29341
29342
29343
29344
29345
29346
29347
29348
29349
29350
29351
29352
29353
29354
29355
29356
29357
29358
29359
29360
29361
29362
29363
29364
29365
29366
29367
29368
29369
29370
29371
29372
29373
29374
29375
29376
29377
29378
29379
29380
29381
29382
29383
29384
29385
29386
29387
29388
29389
29390
29391
29392
29393
29394
29395
29396
29397
29398
29399
29400
29401
29402
29403
29404
29405
29406
29407
29408
29409
29410
29411
29412
29413
29414
29415
29416
29417
29418
29419
29420
29421
29422
29423
29424
29425
29426
29427
29428
29429
29430
29431
29432
29433
29434
29435
29436
29437
29438
29439
29440
29441
29442
29443
29444
29445
29446
29447
29448
29449
29450
29451
29452
29453
29454
29455
29456
29457
29458
29459
29460
29461
29462
29463
29464
29465
29466
29467
29468
29469
29470
29471
29472
29473
29474
29475
29476
29477
29478
29479
29480
29481
29482
29483
29484
29485
29486
29487
29488
29489
29490
29491
29492
29493
29494
29495
29496
29497
29498
29499
29500
29501
29502
29503
29504
29505
29506
29507
29508
29509
29510
29511
29512
29513
29514
29515
29516
29517
29518
29519
29520
29521
29522
29523
29524
29525
29526
29527
29528
29529
29530
29531
29532
29533
29534
29535
29536
29537
29538
29539
29540
29541
29542
29543
29544
29545
29546
29547
29548
29549
29550
29551
29552
29553
29554
29555
29556
29557
29558
29559
29560
29561
29562
29563
29564
29565
29566
29567
29568
29569
29570
29571
29572
29573
29574
29575
29576
29577
29578
29579
29580
29581
29582
29583
29584
29585
29586
29587
29588
29589
29590
29591
29592
29593
29594
29595
29596
29597
29598
29599
29600
29601
29602
29603
29604
29605
29606
29607
29608
29609
29610
29611
29612
29613
29614
29615
29616
29617
29618
29619
29620
29621
29622
29623
29624
29625
29626
29627
29628
29629
29630
29631
29632
29633
29634
29635
29636
29637
29638
29639
29640
29641
29642
29643
29644
29645
29646
29647
29648
29649
29650
29651
29652
29653
29654
29655
29656
29657
29658
29659
29660
29661
29662
29663
29664
29665
29666
29667
29668
29669
29670
29671
29672
29673
29674
29675
29676
29677
29678
29679
29680
29681
29682
29683
29684
29685
29686
29687
29688
29689
29690
29691
29692
29693
29694
29695
29696
29697
29698
29699
29700
29701
29702
29703
29704
29705
29706
29707
29708
29709
29710
29711
29712
29713
29714
29715
29716
29717
29718
29719
29720
29721
29722
29723
29724
29725
29726
29727
29728
29729
29730
29731
29732
29733
29734
29735
29736
29737
29738
29739
29740
29741
29742
29743
29744
29745
29746
29747
29748
29749
29750
29751
29752
29753
29754
29755
29756
29757
29758
29759
29760
29761
29762
29763
29764
29765
29766
29767
29768
29769
29770
29771
29772
29773
29774
29775
29776
29777
29778
29779
29780
29781
29782
29783
29784
29785
29786
29787
29788
29789
29790
29791
29792
29793
29794
29795
29796
29797
29798
29799
29800
29801
29802
29803
29804
29805
29806
29807
29808
29809
29810
29811
29812
29813
29814
29815
29816
29817
29818
29819
29820
29821
29822
29823
29824
29825
29826
29827
29828
29829
29830
29831
29832
29833
29834
29835
29836
29837
29838
29839
29840
29841
29842
29843
29844
29845
29846
29847
29848
29849
29850
29851
29852
29853
29854
29855
29856
29857
29858
29859
29860
29861
29862
29863
29864
29865
29866
29867
29868
29869
29870
29871
29872
29873
29874
29875
29876
29877
29878
29879
29880
29881
29882
29883
29884
29885
29886
29887
29888
29889
29890
29891
29892
29893
29894
29895
29896
29897
29898
29899
29900
29901
29902
29903
29904
29905
29906
29907
29908
29909
29910
29911
29912
29913
29914
29915
29916
29917
29918
29919
29920
29921
29922
29923
29924
29925
29926
29927
29928
29929
29930
29931
29932
29933
29934
29935
29936
29937
29938
29939
29940
29941
29942
29943
29944
29945
29946
29947
29948
29949
29950
29951
29952
29953
29954
29955
29956
29957
29958
29959
29960
29961
29962
29963
29964
29965
29966
29967
29968
29969
29970
29971
29972
29973
29974
29975
29976
29977
29978
29979
29980
29981
29982
29983
29984
29985
29986
29987
29988
29989
29990
29991
29992
29993
29994
29995
29996
29997
29998
29999
30000
30001
30002
30003
30004
30005
30006
30007
30008
30009
30010
30011
30012
30013
30014
30015
30016
30017
30018
30019
30020
30021
30022
30023
30024
30025
30026
30027
30028
30029
30030
30031
30032
30033
30034
30035
30036
30037
30038
30039
30040
30041
30042
30043
30044
30045
30046
30047
30048
30049
30050
30051
30052
30053
30054
30055
30056
30057
30058
30059
30060
30061
30062
30063
30064
30065
30066
30067
30068
30069
30070
30071
30072
30073
30074
30075
30076
30077
30078
30079
30080
30081
30082
30083
30084
30085
30086
30087
30088
30089
30090
30091
30092
30093
30094
30095
30096
30097
30098
30099
30100
30101
30102
30103
30104
30105
30106
30107
30108
30109
30110
30111
30112
30113
30114
30115
30116
30117
30118
30119
30120
30121
30122
30123
30124
30125
30126
30127
30128
30129
30130
30131
30132
30133
30134
30135
30136
30137
30138
30139
30140
30141
30142
30143
30144
30145
30146
30147
30148
30149
30150
30151
30152
30153
30154
30155
30156
30157
30158
30159
30160
30161
30162
30163
30164
30165
30166
30167
30168
30169
30170
30171
30172
30173
30174
30175
30176
30177
30178
30179
30180
30181
30182
30183
30184
30185
30186
30187
30188
30189
30190
30191
30192
30193
30194
30195
30196
30197
30198
30199
30200
30201
30202
30203
30204
30205
30206
30207
30208
30209
30210
30211
30212
30213
30214
30215
30216
30217
30218
30219
30220
30221
30222
30223
30224
30225
30226
30227
| | patientId | class | x | y | width | height | Target | patientAge | patientSex | ViewPosition | StudyID |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 51 | F | PA | |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 48 | F | PA | |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 19 | M | AP | |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | Normal | NaN | NaN | NaN | NaN | 0 | 28 | M | PA | |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity | 264.0 | 152.0 | 213.0 | 379.0 | 1 | 32 | F | AP | |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity | 562.0 | 152.0 | 256.0 | 453.0 | 1 | 32 | F | AP | |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 54 | M | AP | |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 78 | M | PA | |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity | 323.0 | 577.0 | 160.0 | 104.0 | 1 | 75 | M | PA | |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity | 695.0 | 575.0 | 162.0 | 137.0 | 1 | 75 | M | PA |
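For context, a `combined` frame like the one above is typically built by merging the bounding-box labels with the per-patient class info on `patientId`. A minimal sketch with hypothetical miniature frames (`labels_demo`, `class_info_demo`, and `combined_demo` are illustrative names, not the project's variables; the merge itself is an assumption about how `combined` was constructed):

```python
import pandas as pd

# Hypothetical miniature versions of the two RSNA CSVs
# (stage_2_train_labels.csv and stage_2_detailed_class_info.csv).
labels_demo = pd.DataFrame({
    'patientId': ['a', 'b'],
    'x': [264.0, None], 'y': [152.0, None],
    'width': [213.0, None], 'height': [379.0, None],
    'Target': [1, 0],
})
class_info_demo = pd.DataFrame({
    'patientId': ['a', 'b'],
    'class': ['Lung Opacity', 'Normal'],
})

# One row per bounding box; class info repeats for multi-box patients.
combined_demo = labels_demo.merge(class_info_demo, on='patientId', how='left')
print(combined_demo.shape)  # (2, 7)
```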
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
age_effect = sns.distplot(combined[combined['Target'] == 1]['patientAge'], hist = True, kde = False, color = 'green', label = 'Pneumonia Positive')
age_effect_count = age_effect.set_ylabel('Count')
age_effect_count = age_effect.set_title('Age Effect on Pneumonia')
/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms). warnings.warn(msg, FutureWarning)
Observation:
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
age_effect = sns.distplot(combined[combined['Target'] == 0]['patientAge'], hist = True, kde = False, color = 'green', label = 'Pneumonia Negative')
age_effect_count = age_effect.set_ylabel('Count')
age_effect_count = age_effect.set_title('Age Effect on Pneumonia')
Observation:
Normal Data Distribution
neg = sns.distplot(combined[combined['class'] == 'Normal']['patientAge'], hist = True, color = 'blue', kde = False)
normal_patient = neg.set_ylabel('Count')
normal_patient = neg.set_xlabel('Patient Age')
Observation:
No Lung Opacity / Not Normal Data Distribution
abnormal_patient = sns.distplot(combined[combined['class'] == 'No Lung Opacity / Not Normal']['patientAge'], hist = True, color = 'green',kde = False)
abnormal_patient_info = abnormal_patient.set_ylabel('Count')
abnormal_patient_info = abnormal_patient.set_xlabel('Patient Age')
Observation:
Lung Opacity Data Distribution
opacity = sns.distplot(combined[combined['class'] == 'Lung Opacity']['patientAge'], hist=True, color = 'red',kde = False)
opacity_info = opacity.set_ylabel('Count')
opacity_info = opacity.set_xlabel('Patient Age')
Observation:
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
sex = sns.countplot(x = 'Target', hue = 'patientSex', data = combined)
sex_info = sex.set_title('Patient Gender')
Observation:
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
sex_class = sns.countplot(x = 'patientSex', hue = 'class', data = combined)
sex_class_info = sex_class.set_title('Class by Patient Sex')
Observation:
fig, axes = plt.subplots(1, 1, figsize=(20, 7))
age_gender_impact = sns.countplot(x = 'patientAge', hue='patientSex', data = combined, order = combined['patientAge'].value_counts().index)
impact_info = age_gender_impact.set_title('Patient Gender and Age')
Observation:
pneumonia_patients = combined[combined['Target'] == 1]
print(pneumonia_patients.shape)
pneumonia_patients_sample = pneumonia_patients.sample(9555)
pneumonia_patients_sample['xc'] = pneumonia_patients_sample['x'] + pneumonia_patients_sample['width'] / 2
pneumonia_patients_sample['yc'] = pneumonia_patients_sample['y'] + pneumonia_patients_sample['height'] / 2
(9555, 11)
pneumonia_patients_sample['patientAge'] = pneumonia_patients_sample['patientAge'].astype(int)
age_groupA = pneumonia_patients_sample[pneumonia_patients_sample['patientAge'] < 20]
age_groupB = pneumonia_patients_sample[(pneumonia_patients_sample['patientAge'] >= 20) & (pneumonia_patients_sample['patientAge'] < 35)]
age_groupC = pneumonia_patients_sample[(pneumonia_patients_sample['patientAge'] >= 35) & (pneumonia_patients_sample['patientAge'] < 50)]
age_groupD = pneumonia_patients_sample[(pneumonia_patients_sample['patientAge'] >= 50) & (pneumonia_patients_sample['patientAge'] < 65)]
age_groupE = pneumonia_patients_sample[pneumonia_patients_sample['patientAge'] >= 65]
def plot_data(data, color_point, color_window, text):
    fig, ax = plt.subplots(1, 1, figsize=(7, 7))
    plt.title("Centers of Lung Opacity\n{}".format(text))
    data.plot.scatter(x='xc', y='yc', xlim=(0, 1024), ylim=(0, 1024), ax=ax, alpha=0.8, marker=".", color=color_point)
    for i, crt_sample in data.iterrows():
        ax.add_patch(Rectangle(xy=(crt_sample['x'], crt_sample['y']),
                               width=crt_sample['width'], height=crt_sample['height'], alpha=3.5e-3, color=color_window))
    plt.show()
plot_data(age_groupA, 'blue', 'orange', 'Patient Age: 1-19 years')
plot_data(age_groupB, 'blue', 'orange', 'Patient Age: 20-34 years')
plot_data(age_groupC, 'blue', 'orange', 'Patient Age: 35-49 years')
plot_data(age_groupD, 'blue', 'orange', 'Patient Age: 50-64 years')
plot_data(age_groupE, 'blue', 'orange', 'Patient Age: 65+ years')
Observation:
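The five manual age-group filters above can also be expressed in a single `pd.cut` call, which makes the bin edges explicit and easier to change. A short illustrative sketch (the `ages` series and bin labels here are made up, not the project's data):

```python
import pandas as pd

# Same bins as the manual filters: <20, 20-34, 35-49, 50-64, >=65.
ages = pd.Series([5, 22, 40, 58, 70, 81])
bin_edges = [0, 20, 35, 50, 65, 200]
bin_labels = ['1-19', '20-34', '35-49', '50-64', '65+']

# right=False makes each bin half-open: [0, 20), [20, 35), ...
age_group = pd.cut(ages, bins=bin_edges, labels=bin_labels, right=False)
print(age_group.tolist())  # ['1-19', '20-34', '35-49', '50-64', '65+', '65+']
```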
Class
fig, axes = plt.subplots(1, 1, figsize = (7, 7))
VP_vs_Class = sns.countplot(x = 'ViewPosition', hue = 'class', data = combined)
VP_vs_Class_info = VP_vs_Class.set_title('View Position Vs Class')
Observation:
Target
fig, axes = plt.subplots(1, 1, figsize = (7, 7))
VP_vs_Target = sns.countplot(x='Target', hue = 'ViewPosition', data = combined)
VP_vs_Target_info = VP_vs_Target.set_title('View Position Vs Target')
Observation:
Gender Wise
fig, axes = plt.subplots(1, 1, figsize = (7, 7))
VP_vs_Sex = sns.countplot(x='ViewPosition', hue='patientSex', data = combined)
VP_vs_Sex_info = VP_vs_Sex.set_title('View Position Vs Patient Sex')
Observation:
Age wise
fig, axes = plt.subplots(1, 1, figsize=(7, 7))
Age_vs_viewposn = sns.distplot(combined[combined['ViewPosition']=='AP']['patientAge'], hist=True, kde=False, color='red', label='AP')
Age_vs_viewposn = sns.distplot(combined[combined['ViewPosition']=='PA']['patientAge'], hist=True, kde=False, color='green', label='PA')
Age_vs_viewposn_count = Age_vs_viewposn.set_ylabel('Count')
Age_vs_viewposn_count = Age_vs_viewposn.legend()
Age_vs_viewposn_count = Age_vs_viewposn.set_title('View Position Vs Age')
Observation:
# Function to read a DCM image
def read_image(patientId):
    train_images = project_path + '/stage_2_train_images/%s.dcm' % patientId
    dcm = pydicom.dcmread(train_images)
    return dcm
def image_grid(df, pid_sample_list, nrows=3, ncols=3, draw_bbox=True, ax_off=True):
    fig = plt.figure(figsize=(16, 12))
    for i in range(nrows * ncols):
        patient_id = pid_sample_list[i]
        img = read_image(patient_id).pixel_array
        ax = fig.add_subplot(nrows, ncols, i + 1)
        plt.imshow(img, cmap='gray')
        ax.set_title(patient_id)
        if ax_off:
            ax.set_axis_off()
        if draw_bbox:
            bbox_rows = combined[combined['patientId'] == patient_id]
            for _, row in bbox_rows.iterrows():
                x, y = row['x'], row['y']
                width, height = row['width'], row['height']
                bbox = patches.Rectangle((x, y), width, height, linewidth=.5, edgecolor='r', facecolor='none')
                ax.add_patch(bbox)
    plt.tight_layout()
    plt.subplots_adjust(wspace=.01, hspace=.01)
    return fig
#DICOM image with class label "No Lung Opacity / Not Normal"
print('Label: No Lung Opacity / Not Normal')
print(class_info['patientId'][0])
filename = class_info['patientId'][0] + '.dcm'
filename = (os.path.join(project_path,'stage_2_train_images',filename))
dataset = pyd.dcmread(filename)
plt.imshow(dataset.pixel_array, cmap=plt.cm.bone)
plt.show()
Label: No Lung Opacity / Not Normal
0004cfab-14fd-4e49-80ba-63a80b6bddd6
#DICOM image with class label "Normal"
print('Label: Normal')
print(class_info['patientId'][3])
filename = class_info['patientId'][3] + '.dcm'
filename = (os.path.join(project_path,'stage_2_train_images',filename))
dataset = pyd.dcmread(filename)
plt.imshow(dataset.pixel_array, cmap=plt.cm.bone)
plt.show()
Label: Normal
003d8fa0-6bf1-40ed-b54c-ac657f8495c5
#DICOM image with class label "Lung Opacity"
from matplotlib.patches import Rectangle
print('Label: Lung Opacity')
print(labels['patientId'][4])
filename = labels['patientId'][4] + '.dcm'
filename = (os.path.join(project_path,'stage_2_train_images',filename))
dataset = pyd.dcmread(filename)
plt.imshow(dataset.pixel_array, cmap=plt.cm.bone)
bb = Rectangle((labels['x'][4], labels['y'][4]), labels['width'][4], labels['height'][4], fill=False, color='red')
plt.axes().add_patch(bb)
plt.show()
Label: Lung Opacity
00436515-870c-4b36-a041-de91049b9ab4
/usr/local/lib/python3.7/dist-packages/ipykernel_launcher.py:17: MatplotlibDeprecationWarning: Adding an axes using the same arguments as a previous axes currently reuses the earlier instance. In a future version, a new instance will always be created and returned. Meanwhile, this warning can be suppressed, and the future behavior ensured, by passing a unique label to each axes instance.
import pydicom
xrays = combined[combined['ViewPosition']=='PA']['patientId'].sample(20).tolist()
xrays_grid = image_grid(combined, xrays, nrows=2, ncols=3)
xrays = combined[combined['ViewPosition']=='AP']['patientId'].sample(20).tolist()
xrays_grid = image_grid(combined, xrays, nrows=2, ncols=3)
##Splitting into training and validation dataset for input into data generator functions
from sklearn.model_selection import train_test_split
df = combined[['patientId','Target']]
df = df.drop_duplicates()
patientId = df['patientId']
Target = df['Target']
X_train, X_val, Y_train, Y_val = train_test_split(patientId,Target, test_size=0.4, random_state=42)
##Print the distribution of labels between the training and validation dataset
print("Ratio of Pneumonia to Non-Pneumonia Labels in training dataset is: {}".format(round(Y_train.value_counts()[1] \
/len(Y_train),2)))
print("Ratio of Pneumonia to Non-Pneumonia Labels in validation dataset is: {}".format(round(Y_val.value_counts()[1] \
/len(Y_val),2)))
print("No. of records in training dataset is: {}".format(len(X_train)))
print("No. of records in validation dataset is: {}".format(len(X_val)))
Ratio of Pneumonia to Non-Pneumonia Labels in training dataset is: 0.22
Ratio of Pneumonia to Non-Pneumonia Labels in validation dataset is: 0.23
No. of records in training dataset is: 16010
No. of records in validation dataset is: 10674
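The split above is purely random; with only ~22% positive labels, passing `stratify=Target` to `train_test_split` would guarantee the class balance is identical in both sets rather than merely close. The underlying idea can be sketched without sklearn (the `stratified_split` helper and the 100 hypothetical patients below are illustrative, not the project's code):

```python
import random

def stratified_split(ids, labels, test_size=0.4, seed=42):
    """Split ids so each label keeps roughly the same proportion in both sets."""
    rng = random.Random(seed)
    by_label = {}
    for pid, lab in zip(ids, labels):
        by_label.setdefault(lab, []).append(pid)
    train, val = [], []
    for group in by_label.values():
        rng.shuffle(group)
        cut = int(round(len(group) * (1 - test_size)))
        train.extend(group[:cut])
        val.extend(group[cut:])
    return train, val

# 100 hypothetical patients, ~22% positive like the real label distribution.
ids = ['p%d' % i for i in range(100)]
labs = [1 if i < 22 else 0 for i in range(100)]
train_ids, val_ids = stratified_split(ids, labs)
print(len(train_ids), len(val_ids))  # 60 40
```

Each label group is shuffled and cut at the same proportion, so both the training and validation sets end up with ~22% positives.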
Data Processing
Extract Data from DICOM file
# n equals the full dataset size (30227 rows), so this effectively just shuffles the records
combined_sample = combined.sample(n=30227)
temp_data_directory = project_path + 'pneumonia_status'
print(temp_data_directory)
# makedirs creates the parent directory too and is a no-op if it already exists
os.makedirs(f'{temp_data_directory}/positive', exist_ok=True)
os.makedirs(f'{temp_data_directory}/negative', exist_ok=True)
/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status
# Copy each patient's DICOM into the working directory that is read below
for value in combined_sample['patientId']:
    !cp '/content/drive/MyDrive/Colab Notebooks/Capstone Project/stage_2_train_images/{value}.dcm' '/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/'
combined_sample['path']=f'{temp_data_directory}'+'/'+combined_sample['patientId'].astype(str)+'.dcm'
print(temp_data_directory)
/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status
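The `!cp` loop above is Colab shell magic; the same copy can be done portably (and restartably) with the standard library. A sketch under the assumption of one `<patientId>.dcm` file per patient, demonstrated here with hypothetical temporary files standing in for the DICOMs:

```python
import os
import shutil
import tempfile

def copy_dicoms(patient_ids, src_dir, dst_dir, ext='.dcm'):
    """Copy one file per patient id, creating dst_dir if needed and
    skipping files that were already copied on a previous run."""
    os.makedirs(dst_dir, exist_ok=True)
    copied = 0
    for pid in patient_ids:
        src = os.path.join(src_dir, pid + ext)
        dst = os.path.join(dst_dir, pid + ext)
        if os.path.exists(src) and not os.path.exists(dst):
            shutil.copyfile(src, dst)
            copied += 1
    return copied

# Self-contained demo with placeholder files
src = tempfile.mkdtemp()
dst = tempfile.mkdtemp()
ids = ['a', 'b']
for pid in ids:
    with open(os.path.join(src, pid + '.dcm'), 'wb') as f:
        f.write(b'\x00')
n = copy_dicoms(ids, src, os.path.join(dst, 'pneumonia_status'))
```

The existence check makes the copy idempotent, which matters when a long Colab session gets interrupted midway.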
import pydicom
age, gender, view_posn, study_id = [], [], [], []
for value in combined_sample['patientId']:
    # Read each patient's DICOM and collect the header fields of interest
    dcm_patientFile = '/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/%s.dcm' % value
    dcm_patientData = pydicom.dcmread(dcm_patientFile)  # dcmread supersedes the deprecated read_file
    age.append(dcm_patientData.PatientAge)
    gender.append(dcm_patientData.PatientSex)
    view_posn.append(dcm_patientData.ViewPosition)
    study_id.append(dcm_patientData.StudyID)
combined_sample['age'] = age
combined_sample['sex'] = gender
combined_sample['ViewPosition'] = view_posn
combined_sample['StudyID'] = study_id
combined_sample.head(10)
| | patientId | class | x | y | width | height | Target | patientAge | patientSex | ViewPosition | StudyID | path | age | sex |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 4368 | 3e77e593-74c8-4f9c-aa09-8e0cabffd059 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 57 | F | AP | /content/drive/MyDrive/Colab Notebooks/Capston... | 57 | F | |
| 23045 | cfc035aa-7dc6-418a-a70d-21f0a924c6fb | Lung Opacity | 563.0 | 325.0 | 239.0 | 317.0 | 1 | 63 | M | AP | /content/drive/MyDrive/Colab Notebooks/Capston... | 63 | M | |
| 22455 | cb05cd95-3e60-47e7-a68c-afaa343d05e3 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 53 | M | PA | /content/drive/MyDrive/Colab Notebooks/Capston... | 53 | M | |
| 6368 | 4e5ba406-f79d-4f99-840a-168bf59dcec4 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 48 | M | PA | /content/drive/MyDrive/Colab Notebooks/Capston... | 48 | M | |
| 16769 | a190459e-99ff-4bf0-9e30-47c2b930ac8a | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 64 | F | AP | /content/drive/MyDrive/Colab Notebooks/Capston... | 64 | F | |
| 11172 | 74bf0245-0e19-45d6-9f31-65eaff09fb8a | Lung Opacity | 635.0 | 624.0 | 143.0 | 128.0 | 1 | 52 | F | PA | /content/drive/MyDrive/Colab Notebooks/Capston... | 52 | F | |
| 2116 | 26c15b0a-9865-414d-94b2-5349e8903f88 | Lung Opacity | 222.0 | 478.0 | 248.0 | 236.0 | 1 | 67 | M | AP | /content/drive/MyDrive/Colab Notebooks/Capston... | 67 | M | |
| 16159 | 9cb1987f-476b-43a1-aaa1-9fa529333262 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 21 | M | AP | /content/drive/MyDrive/Colab Notebooks/Capston... | 21 | M | |
| 2509 | 3251dea8-4f74-4f4b-8f56-167b0213414b | Lung Opacity | 589.0 | 372.0 | 214.0 | 351.0 | 1 | 24 | M | AP | /content/drive/MyDrive/Colab Notebooks/Capston... | 24 | M | |
| 9877 | 6ad73a5d-2d72-416f-ac53-01f1bcc5598d | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 77 | M | PA | /content/drive/MyDrive/Colab Notebooks/Capston... | 77 | M |
combined_sample.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 30227 entries, 14311 to 16155
Data columns (total 14 columns):
 #   Column        Non-Null Count  Dtype
---  ------        --------------  -----
 0   patientId     30227 non-null  object
 1   class         30227 non-null  object
 2   x             9555 non-null   float64
 3   y             9555 non-null   float64
 4   width         9555 non-null   float64
 5   height        9555 non-null   float64
 6   Target        30227 non-null  int64
 7   patientAge    30227 non-null  object
 8   patientSex    30227 non-null  object
 9   ViewPosition  30227 non-null  object
 10  StudyID       30227 non-null  object
 11  path          30227 non-null  object
 12  age           30227 non-null  object
 13  sex           30227 non-null  object
dtypes: float64(4), int64(1), object(9)
memory usage: 3.5+ MB
Splitting Data into the Two Classes
non_pneumonia = combined_sample[combined_sample['Target']==0]
print(len(non_pneumonia))
non_pneumonia.head()
20672
| | patientId | class | x | y | width | height | Target | patientAge | patientSex | ViewPosition | StudyID | path | age | sex |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 4368 | 3e77e593-74c8-4f9c-aa09-8e0cabffd059 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 57 | F | AP | /content/drive/MyDrive/Colab Notebooks/Capston... | 57 | F | |
| 22455 | cb05cd95-3e60-47e7-a68c-afaa343d05e3 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 53 | M | PA | /content/drive/MyDrive/Colab Notebooks/Capston... | 53 | M | |
| 6368 | 4e5ba406-f79d-4f99-840a-168bf59dcec4 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 48 | M | PA | /content/drive/MyDrive/Colab Notebooks/Capston... | 48 | M | |
| 16769 | a190459e-99ff-4bf0-9e30-47c2b930ac8a | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 64 | F | AP | /content/drive/MyDrive/Colab Notebooks/Capston... | 64 | F | |
| 16159 | 9cb1987f-476b-43a1-aaa1-9fa529333262 | No Lung Opacity / Not Normal | NaN | NaN | NaN | NaN | 0 | 21 | M | AP | /content/drive/MyDrive/Colab Notebooks/Capston... | 21 | M |
pneumonia = combined_sample[combined_sample['Target']==1]
# drop_duplicates on the pair keeps path and patientId aligned row-by-row;
# calling .unique() on each column separately relies on both keeping the same order
pneumonia_cases = pneumonia[['path','patientId']].drop_duplicates()
len(pneumonia_cases)
6012
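The count drops from the raw row count to 6012 because a pneumonia image can carry several bounding boxes, i.e. several rows per patient. De-duplicating the (path, patientId) *pair* is what keeps the two columns aligned; a pure-Python sketch of that pattern with toy values:

```python
def dedup_pairs(paths, ids):
    """Drop duplicate (path, id) rows while preserving first-seen order."""
    seen = set()
    out = []
    for pair in zip(paths, ids):
        if pair not in seen:
            seen.add(pair)
            out.append(pair)
    return out

paths = ['p1.dcm', 'p1.dcm', 'p2.dcm']   # p1 has two bounding boxes
ids   = ['p1',     'p1',     'p2']
pairs = dedup_pairs(paths, ids)
```

De-duplicating each column on its own would also happen to work here, but only because both columns are derived from patientId in the same row order.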
from tqdm import tqdm
from skimage.transform import resize
for _, row in tqdm(pneumonia_cases.iterrows()):
    # dcmread supersedes the deprecated pydicom.read_file
    img = pydicom.dcmread(row['path']).pixel_array
    img = resize(img, (256, 256))
    plt.imsave('/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/positive/' + row['patientId'] + '.jpg', img, cmap='gray')
non_pneumonia = combined_sample[combined_sample['Target']==0]
no_pneumonia = non_pneumonia[['path','patientId']].drop_duplicates()
len(no_pneumonia)
20672
for _, row in tqdm(no_pneumonia.iterrows()):
    img = pydicom.dcmread(row['path']).pixel_array
    img = resize(img, (256, 256))
    plt.imsave('/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/negative/' + row['patientId'] + '.jpg', img, cmap='gray')
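DICOM `pixel_array` values are typically 12- or 16-bit, so the intensity range varies per image; `plt.imsave` normalizes to the data range before applying the colormap. Making that min-max scaling explicit keeps the step auditable. A stdlib-only sketch of the scaling (toy pixel values, not the project data):

```python
def minmax_scale(pixels):
    """Scale an iterable of raw pixel values to floats in [0, 1]."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:          # constant image: avoid division by zero
        return [0.0 for _ in pixels]
    return [(p - lo) / (hi - lo) for p in pixels]

scaled = minmax_scale([0, 512, 1024, 4095])
```

Per-image scaling discards absolute intensity, which is usually acceptable for CXR classification since exposure varies between machines anyway.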
from tensorflow.keras.applications.vgg19 import VGG19,preprocess_input
# Note: preprocess_input already applies VGG-style preprocessing, so the
# samplewise normalisation and rescale settings partly overlap with it
datagen = ImageDataGenerator(samplewise_center = True, samplewise_std_normalization = True, horizontal_flip = True,
                             width_shift_range = 0.05, rescale = 1/255, fill_mode = 'nearest', height_shift_range = 0.05,
                             preprocessing_function = preprocess_input, validation_split = 0.1,
                             )
train=datagen.flow_from_directory('/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status', color_mode = 'rgb', batch_size = 128, class_mode = 'binary', subset = 'training')
test=datagen.flow_from_directory('/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status', color_mode = 'rgb', batch_size = 32, class_mode = 'binary', subset = 'validation', shuffle = False)  # fixed order so labels line up with predictions later
Found 24016 images belonging to 2 classes. Found 2668 images belonging to 2 classes.
train.class_indices
{'negative': 0, 'positive': 1}
test.class_indices
{'negative': 0, 'positive': 1}
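`class_indices` maps folder names to integer labels; inverting it gives a clean way to turn sigmoid outputs back into class names at prediction time. A small sketch (the `label_of` helper is illustrative, not part of Keras):

```python
class_indices = {'negative': 0, 'positive': 1}   # as reported by the generators
index_to_class = {v: k for k, v in class_indices.items()}

def label_of(probability, threshold=0.5):
    """Map a single sigmoid output to the generator's class name."""
    return index_to_class[int(probability >= threshold)]

labels = [label_of(p) for p in (0.84, 0.12)]
```

Reading the mapping from the generator, rather than hard-coding it, protects against Keras assigning indices in a different folder order.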
pre_trained_model = VGG19(input_shape = (256,256,3), include_top = False, weights = 'imagenet')
for layer in pre_trained_model.layers:
layer.trainable = False
pre_trained_model.summary()
last_layer = pre_trained_model.get_layer('block5_pool')
print('last layer output shape: ', last_layer.output_shape)
last_output = last_layer.output
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg19/vgg19_weights_tf_dim_ordering_tf_kernels_notop.h5
80142336/80134624 [==============================] - 1s 0us/step
80150528/80134624 [==============================] - 1s 0us/step
Model: "vgg19"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_2 (InputLayer) [(None, 256, 256, 3)] 0
block1_conv1 (Conv2D) (None, 256, 256, 64) 1792
block1_conv2 (Conv2D) (None, 256, 256, 64) 36928
block1_pool (MaxPooling2D) (None, 128, 128, 64) 0
block2_conv1 (Conv2D) (None, 128, 128, 128) 73856
block2_conv2 (Conv2D) (None, 128, 128, 128) 147584
block2_pool (MaxPooling2D) (None, 64, 64, 128) 0
block3_conv1 (Conv2D) (None, 64, 64, 256) 295168
block3_conv2 (Conv2D) (None, 64, 64, 256) 590080
block3_conv3 (Conv2D) (None, 64, 64, 256) 590080
block3_conv4 (Conv2D) (None, 64, 64, 256) 590080
block3_pool (MaxPooling2D) (None, 32, 32, 256) 0
block4_conv1 (Conv2D) (None, 32, 32, 512) 1180160
block4_conv2 (Conv2D) (None, 32, 32, 512) 2359808
block4_conv3 (Conv2D) (None, 32, 32, 512) 2359808
block4_conv4 (Conv2D) (None, 32, 32, 512) 2359808
block4_pool (MaxPooling2D) (None, 16, 16, 512) 0
block5_conv1 (Conv2D) (None, 16, 16, 512) 2359808
block5_conv2 (Conv2D) (None, 16, 16, 512) 2359808
block5_conv3 (Conv2D) (None, 16, 16, 512) 2359808
block5_conv4 (Conv2D) (None, 16, 16, 512) 2359808
block5_pool (MaxPooling2D) (None, 8, 8, 512) 0
=================================================================
Total params: 20,024,384
Trainable params: 0
Non-trainable params: 20,024,384
_________________________________________________________________
last layer output shape: (None, 8, 8, 512)
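The parameter counts in the summary above follow a simple rule: a Conv2D layer has `kernel_h * kernel_w * in_channels * out_channels` weights plus one bias per output channel. A quick check against two of the listed layers:

```python
def conv2d_params(kh, kw, in_ch, out_ch):
    """Weights plus one bias per output channel for a Conv2D layer."""
    return kh * kw * in_ch * out_ch + out_ch

p1 = conv2d_params(3, 3, 3, 64)     # block1_conv1: 3x3 kernels over RGB input
p2 = conv2d_params(3, 3, 64, 128)   # block2_conv1
```

These reproduce the 1,792 and 73,856 figures from the summary; with every layer frozen, all 20,024,384 parameters are non-trainable, so only the classification head added next will learn.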
from tensorflow.keras.layers import Flatten,Dense,Dropout,BatchNormalization,LeakyReLU,GaussianDropout
# Classification head on top of the frozen VGG19 features
x = Flatten()(last_output)
x = Dense(1024)(x)
x = LeakyReLU(0.1)(x)
x = Dropout(0.25)(x)
x = BatchNormalization()(x)
x = Dense(1024)(x)
x = LeakyReLU(0.1)(x)
x = Dropout(0.25)(x)
x = BatchNormalization()(x)
x = Dense(1, activation='sigmoid')(x)
from tensorflow.keras.models import Model
vgg19model = Model(pre_trained_model.input, x)
vgg19model.compile(optimizer = 'adam',
loss = 'binary_crossentropy',
metrics = ['accuracy'])
from tensorflow.keras.callbacks import EarlyStopping,ReduceLROnPlateau
early=EarlyStopping(monitor='accuracy',patience=3,mode='auto')  # defined but not passed to fit() below
reduce_lr = ReduceLROnPlateau(monitor='accuracy', factor=0.5, patience=2, verbose=1,cooldown=0, mode='auto',min_delta=0.0001, min_lr=0)
class_weight={0:1,1:3.3}  # up-weight the minority (pneumonia) class, roughly the negative:positive ratio
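The 3.3 weight is roughly the inverse class frequency: about 20,672 negative images versus 6,012 positive ones. A sketch of deriving it from the counts rather than hard-coding it (the helper name is illustrative):

```python
def balanced_weights(n_neg, n_pos):
    """Inverse-frequency class weights, normalised so the majority class is 1."""
    return {0: 1.0, 1: n_neg / n_pos}

w = balanced_weights(20672, 6012)   # counts from the class split above
```

Computing the weight from the data keeps it correct if the sample or the split changes; the document's fixed 3.3 is a slightly rounded-down version of the same quantity (~3.44).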
vgg19model.fit(train, epochs = 20, callbacks = [reduce_lr], steps_per_epoch = 100, validation_data = test, class_weight = class_weight)
Epoch 1/20
100/100 [==============================] - 1709s 17s/step - loss: 1.1270 - accuracy: 0.6859 - val_loss: 0.7460 - val_accuracy: 0.6158 - lr: 0.0010
Epoch 2/20
100/100 [==============================] - 889s 9s/step - loss: 0.8518 - accuracy: 0.7230 - val_loss: 0.4217 - val_accuracy: 0.7935 - lr: 0.0010
Epoch 3/20
100/100 [==============================] - 958s 10s/step - loss: 0.7849 - accuracy: 0.7349 - val_loss: 0.4559 - val_accuracy: 0.7717 - lr: 0.0010
Epoch 4/20
100/100 [==============================] - 1005s 10s/step - loss: 0.7634 - accuracy: 0.7397 - val_loss: 0.4232 - val_accuracy: 0.7980 - lr: 0.0010
Epoch 5/20
100/100 [==============================] - 974s 10s/step - loss: 0.7451 - accuracy: 0.7506 - val_loss: 0.4513 - val_accuracy: 0.7785 - lr: 0.0010
Epoch 6/20
100/100 [==============================] - 948s 9s/step - loss: 0.7397 - accuracy: 0.7553 - val_loss: 0.3899 - val_accuracy: 0.8107 - lr: 0.0010
Epoch 7/20
100/100 [==============================] - 1010s 10s/step - loss: 0.7268 - accuracy: 0.7518 - val_loss: 0.3972 - val_accuracy: 0.8310 - lr: 0.0010
Epoch 8/20
100/100 [==============================] - 947s 9s/step - loss: 0.7338 - accuracy: 0.7572 - val_loss: 0.4062 - val_accuracy: 0.8126 - lr: 0.0010
Epoch 9/20
100/100 [==============================] - 991s 10s/step - loss: 0.7260 - accuracy: 0.7567 - val_loss: 0.4065 - val_accuracy: 0.8118 - lr: 0.0010
Epoch 10/20
100/100 [==============================] - 964s 10s/step - loss: 0.7088 - accuracy: 0.7620 - val_loss: 0.3780 - val_accuracy: 0.8437 - lr: 0.0010
Epoch 11/20
100/100 [==============================] - 1009s 10s/step - loss: 0.7203 - accuracy: 0.7563 - val_loss: 0.3526 - val_accuracy: 0.8471 - lr: 0.0010
Epoch 12/20
100/100 [==============================] - ETA: 0s - loss: 0.7110 - accuracy: 0.7596
Epoch 12: ReduceLROnPlateau reducing learning rate to 0.0005000000237487257.
100/100 [==============================] - 1064s 11s/step - loss: 0.7110 - accuracy: 0.7596 - val_loss: 0.3890 - val_accuracy: 0.8298 - lr: 0.0010
Epoch 13/20
100/100 [==============================] - 995s 10s/step - loss: 0.6942 - accuracy: 0.7698 - val_loss: 0.3680 - val_accuracy: 0.8313 - lr: 5.0000e-04
Epoch 14/20
100/100 [==============================] - 992s 10s/step - loss: 0.6783 - accuracy: 0.7742 - val_loss: 0.4064 - val_accuracy: 0.8190 - lr: 5.0000e-04
Epoch 15/20
100/100 [==============================] - 990s 10s/step - loss: 0.6732 - accuracy: 0.7745 - val_loss: 0.3831 - val_accuracy: 0.8332 - lr: 5.0000e-04
Epoch 16/20
100/100 [==============================] - 982s 10s/step - loss: 0.6771 - accuracy: 0.7664 - val_loss: 0.3568 - val_accuracy: 0.8355 - lr: 5.0000e-04
Epoch 17/20
100/100 [==============================] - 992s 10s/step - loss: 0.6713 - accuracy: 0.7834 - val_loss: 0.3603 - val_accuracy: 0.8373 - lr: 5.0000e-04
Epoch 18/20
100/100 [==============================] - 1022s 10s/step - loss: 0.6696 - accuracy: 0.7737 - val_loss: 0.3742 - val_accuracy: 0.8400 - lr: 5.0000e-04
Epoch 19/20
100/100 [==============================] - ETA: 0s - loss: 0.6708 - accuracy: 0.7815
Epoch 19: ReduceLROnPlateau reducing learning rate to 0.0002500000118743628.
100/100 [==============================] - 968s 10s/step - loss: 0.6708 - accuracy: 0.7815 - val_loss: 0.4215 - val_accuracy: 0.7976 - lr: 5.0000e-04
Epoch 20/20
100/100 [==============================] - 976s 10s/step - loss: 0.6599 - accuracy: 0.7822 - val_loss: 0.3770 - val_accuracy: 0.8306 - lr: 2.5000e-04
<keras.callbacks.History at 0x7fee8bd46610>
Save the Model
vgg19model.save('/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/vgg19_model.h5')
Plotting Accuracy and Validation Accuracy
val_acc=np.asarray(vgg19model.history.history['val_accuracy'])*100
acc=np.asarray(vgg19model.history.history['accuracy'])*100
acc=pd.DataFrame({'val_acc':val_acc,'acc':acc})
acc.plot(figsize=(20,10),yticks=range(50,100,5))
<matplotlib.axes._subplots.AxesSubplot at 0x7fee8bd46610>
Plotting Loss and Validation Loss
loss=vgg19model.history.history['loss']
val_loss=vgg19model.history.history['val_loss']
loss=pd.DataFrame({'val_loss':val_loss,'loss':loss})
loss.plot(figsize=(20,10))
<matplotlib.axes._subplots.AxesSubplot at 0x7fed8d5f3490>
Model testing
Testing with saved Model
from keras.models import load_model
from keras.preprocessing import image
#load the model we saved; load_model already restores the compiled state,
#so recompiling (here with a different optimizer) only matters if training resumes
Vgg19_loaded_model = load_model('/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/vgg19_model.h5')
Vgg19_loaded_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
y=[]
test.reset()
# Collect the labels of the first 4 batches (4 x 32 = 128 samples); this only
# lines up with predict() below if the test generator does not shuffle
for i in tqdm(range(4)):
    _,target=test.__getitem__(i)
    for j in target:
        y.append(j)
100%|██████████| 4/4 [00:02<00:00, 1.84it/s]
test.reset()
y_pred=Vgg19_loaded_model.predict(test)
pred=[]
# binarise each sigmoid output at the 0.5 threshold
for i in y_pred:
    if i[0]>=0.5:
        pred.append(1)
    else:
        pred.append(0)
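The 0.5 cutoff above is a default, not a given; with a 3:1 class imbalance it can pay to sweep the threshold on the validation set and keep the one with the best F1 for the positive class. A stdlib sketch with toy scores (not the model's outputs):

```python
def f1_at(threshold, scores, truth):
    """F1 of the positive class when scores >= threshold count as positive."""
    tp = sum(1 for s, t in zip(scores, truth) if s >= threshold and t == 1)
    fp = sum(1 for s, t in zip(scores, truth) if s >= threshold and t == 0)
    fn = sum(1 for s, t in zip(scores, truth) if s < threshold and t == 1)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

scores = [0.9, 0.8, 0.4, 0.3, 0.2]
truth  = [1,   1,   1,   0,   0]
best = max((t / 10 for t in range(1, 10)), key=lambda t: f1_at(t, scores, truth))
```

Any threshold chosen this way should be fixed on validation data before being reported on a held-out test set.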
from sklearn.metrics import roc_curve,auc,precision_recall_curve,classification_report
print(classification_report(y,pred[:len(y)]))
precision recall f1-score support
0.0 0.87 0.89 0.88 94
1.0 0.68 0.62 0.65 34
accuracy 0.82 128
macro avg 0.77 0.76 0.76 128
weighted avg 0.82 0.82 0.82 128
ROC Curve
plt.figure(figsize=(20,10))
fprr,tprr,_=roc_curve(y,y_pred[:len(y)])
area_under_curver=auc(fprr,tprr)
print('The area under the curve is:',area_under_curver)
# Plot area under curve
plt.plot(fprr,tprr,'b.-')
plt.xlabel('False positive rate')
plt.ylabel('True positive rate')
plt.plot(fprr,fprr,linestyle='--',color='black')
The area under the curve is: 0.8269712140175219
[<matplotlib.lines.Line2D at 0x7fed8d44ee90>]
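The AUC of ~0.83 reported above has a direct probabilistic reading: it is the probability that a randomly chosen positive case is scored higher than a randomly chosen negative one. A stdlib sketch of that pairwise definition, on toy scores rather than the model's outputs:

```python
def pairwise_auc(scores, truth):
    """AUC as the fraction of (positive, negative) pairs ranked correctly;
    ties count as half a win."""
    pos = [s for s, t in zip(scores, truth) if t == 1]
    neg = [s for s, t in zip(scores, truth) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

auc_toy = pairwise_auc([0.9, 0.7, 0.6, 0.2], [1, 0, 1, 0])
```

This pairwise form agrees with the trapezoidal area under the ROC curve that `sklearn.metrics.auc` computes, and is why AUC is insensitive to the 0.5 classification threshold.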
#Setting the dimensions of our images
img_width, img_height = 256,256
def load_image(img_path, show=False):
    img = image.load_img(img_path, target_size=(256, 256))
    img_tensor = image.img_to_array(img)              # (height, width, channels)
    img_tensor = np.expand_dims(img_tensor, axis=0)   # (1, height, width, channels): the model expects a batch dimension
    img_tensor /= 255.                                # imshow expects values in the range [0, 1]
    if show:
        plt.imshow(img_tensor[0])
        plt.axis('off')
        plt.show()
    return img_tensor
Prediction on Test Image
# image path
img_path = '/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/positive/000db696-cf54-4385-b10b-6b16fbb3f985.jpg' # positive
# load a single image
test_image = load_image(img_path)
# check prediction
prediction = Vgg19_loaded_model.predict(test_image)
print(prediction)
[[0.8421707]]
import keras
from tensorflow.keras import Model
from keras.applications.vgg16 import VGG16
from keras.applications.vgg16 import preprocess_input
from tensorflow.keras.preprocessing.image import ImageDataGenerator
vgg_model = VGG16(input_shape = (256,256,3),include_top = False,weights = 'imagenet')
output = vgg_model.layers[-1].output
output = keras.layers.Flatten()(output)
vgg_model = Model(vgg_model.input, output)
for layer in vgg_model.layers:
layer.trainable = False
vgg_model.summary()
Model: "model_2"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_3 (InputLayer) [(None, 256, 256, 3)] 0
block1_conv1 (Conv2D) (None, 256, 256, 64) 1792
block1_conv2 (Conv2D) (None, 256, 256, 64) 36928
block1_pool (MaxPooling2D) (None, 128, 128, 64) 0
block2_conv1 (Conv2D) (None, 128, 128, 128) 73856
block2_conv2 (Conv2D) (None, 128, 128, 128) 147584
block2_pool (MaxPooling2D) (None, 64, 64, 128) 0
block3_conv1 (Conv2D) (None, 64, 64, 256) 295168
block3_conv2 (Conv2D) (None, 64, 64, 256) 590080
block3_conv3 (Conv2D) (None, 64, 64, 256) 590080
block3_pool (MaxPooling2D) (None, 32, 32, 256) 0
block4_conv1 (Conv2D) (None, 32, 32, 512) 1180160
block4_conv2 (Conv2D) (None, 32, 32, 512) 2359808
block4_conv3 (Conv2D) (None, 32, 32, 512) 2359808
block4_pool (MaxPooling2D) (None, 16, 16, 512) 0
block5_conv1 (Conv2D) (None, 16, 16, 512) 2359808
block5_conv2 (Conv2D) (None, 16, 16, 512) 2359808
block5_conv3 (Conv2D) (None, 16, 16, 512) 2359808
block5_pool (MaxPooling2D) (None, 8, 8, 512) 0
flatten_2 (Flatten) (None, 32768) 0
=================================================================
Total params: 14,714,688
Trainable params: 0
Non-trainable params: 14,714,688
_________________________________________________________________
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout, InputLayer
from keras.models import Sequential
from keras import optimizers
input_shape=(256,256,3)
vmodel = Sequential()
vmodel.add(vgg_model)
vmodel.add(Dense(512, activation='relu'))  # input shape is inherited from vgg_model, so input_dim is not needed
vmodel.add(Dropout(0.3))
vmodel.add(Dense(512, activation='relu'))
vmodel.add(Dropout(0.3))
vmodel.add(Dense(1, activation='sigmoid'))
vmodel.compile(optimizer = 'adam',loss = 'binary_crossentropy',metrics = ['accuracy'])
vmodel.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
model_2 (Functional) (None, 32768) 14714688
dense_6 (Dense) (None, 512) 16777728
dropout_4 (Dropout) (None, 512) 0
dense_7 (Dense) (None, 512) 262656
dropout_5 (Dropout) (None, 512) 0
dense_8 (Dense) (None, 1) 513
=================================================================
Total params: 31,755,585
Trainable params: 17,040,897
Non-trainable params: 14,714,688
_________________________________________________________________
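The 16,777,728 parameters of `dense_6` dominate this model, and follow directly from the Dense parameter formula `in_features * units + units`. A quick check against the summary:

```python
def dense_params(in_features, units):
    """Weights plus biases for a fully connected (Dense) layer."""
    return in_features * units + units

d6 = dense_params(32768, 512)   # dense_6 on top of the flattened 8*8*512 VGG16 output
d8 = dense_params(512, 1)       # final sigmoid unit
```

This is why the flattened 32,768-dimensional feature vector makes the head so large; a GlobalAveragePooling2D layer in place of Flatten would shrink the input to 512 features at the cost of spatial detail.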
vmodel.fit(train,epochs=20,callbacks=[reduce_lr],steps_per_epoch=100,validation_data=test,class_weight=class_weight)
Epoch 1/20
100/100 [==============================] - 2936s 29s/step - loss: 16.3880 - accuracy: 0.6074 - val_loss: 0.3904 - val_accuracy: 0.8388 - lr: 0.0010
Epoch 2/20
100/100 [==============================] - 1296s 13s/step - loss: 0.7996 - accuracy: 0.7296 - val_loss: 0.4673 - val_accuracy: 0.8051 - lr: 0.0010
Epoch 3/20
100/100 [==============================] - 906s 9s/step - loss: 0.7758 - accuracy: 0.7505 - val_loss: 0.4301 - val_accuracy: 0.7800 - lr: 0.0010
Epoch 4/20
100/100 [==============================] - 788s 8s/step - loss: 0.7732 - accuracy: 0.7509 - val_loss: 0.3988 - val_accuracy: 0.8246 - lr: 0.0010
Epoch 5/20
100/100 [==============================] - 753s 8s/step - loss: 0.7596 - accuracy: 0.7605 - val_loss: 0.3634 - val_accuracy: 0.8497 - lr: 0.0010
Epoch 6/20
100/100 [==============================] - 749s 7s/step - loss: 0.7738 - accuracy: 0.7570 - val_loss: 0.3692 - val_accuracy: 0.8253 - lr: 0.0010
Epoch 7/20
100/100 [==============================] - 758s 8s/step - loss: 0.7565 - accuracy: 0.7628 - val_loss: 0.4262 - val_accuracy: 0.7699 - lr: 0.0010
Epoch 8/20
100/100 [==============================] - 790s 8s/step - loss: 0.7380 - accuracy: 0.7644 - val_loss: 0.4117 - val_accuracy: 0.8190 - lr: 0.0010
Epoch 9/20
100/100 [==============================] - 887s 9s/step - loss: 0.7495 - accuracy: 0.7651 - val_loss: 0.4980 - val_accuracy: 0.7864 - lr: 0.0010
Epoch 10/20
100/100 [==============================] - 935s 9s/step - loss: 0.7441 - accuracy: 0.7643 - val_loss: 0.3974 - val_accuracy: 0.8002 - lr: 0.0010
Epoch 11/20
100/100 [==============================] - ETA: 0s - loss: 0.7558 - accuracy: 0.7639
Epoch 11: ReduceLROnPlateau reducing learning rate to 0.0005000000237487257.
100/100 [==============================] - 942s 9s/step - loss: 0.7558 - accuracy: 0.7639 - val_loss: 0.4567 - val_accuracy: 0.8216 - lr: 0.0010
Epoch 12/20
100/100 [==============================] - 923s 9s/step - loss: 0.7302 - accuracy: 0.7720 - val_loss: 0.3922 - val_accuracy: 0.8298 - lr: 5.0000e-04
Epoch 13/20
100/100 [==============================] - 920s 9s/step - loss: 0.7204 - accuracy: 0.7715 - val_loss: 0.3972 - val_accuracy: 0.8407 - lr: 5.0000e-04
Epoch 14/20
100/100 [==============================] - 939s 9s/step - loss: 0.7076 - accuracy: 0.7766 - val_loss: 0.3971 - val_accuracy: 0.8302 - lr: 5.0000e-04
Epoch 15/20
100/100 [==============================] - 892s 9s/step - loss: 0.6967 - accuracy: 0.7833 - val_loss: 0.4148 - val_accuracy: 0.7939 - lr: 5.0000e-04
Epoch 16/20
100/100 [==============================] - 923s 9s/step - loss: 0.7071 - accuracy: 0.7763 - val_loss: 0.3961 - val_accuracy: 0.8257 - lr: 5.0000e-04
Epoch 17/20
100/100 [==============================] - 901s 9s/step - loss: 0.7064 - accuracy: 0.7840 - val_loss: 0.3932 - val_accuracy: 0.8385 - lr: 5.0000e-04
Epoch 18/20
100/100 [==============================] - 940s 9s/step - loss: 0.6980 - accuracy: 0.7805 - val_loss: 0.3851 - val_accuracy: 0.8298 - lr: 5.0000e-04
Epoch 19/20
100/100 [==============================] - ETA: 0s - loss: 0.6877 - accuracy: 0.7789
Epoch 19: ReduceLROnPlateau reducing learning rate to 0.0002500000118743628.
100/100 [==============================] - 980s 10s/step - loss: 0.6877 - accuracy: 0.7789 - val_loss: 0.4024 - val_accuracy: 0.8130 - lr: 5.0000e-04
Epoch 20/20
100/100 [==============================] - 955s 10s/step - loss: 0.6973 - accuracy: 0.7753 - val_loss: 0.4215 - val_accuracy: 0.8096 - lr: 2.5000e-04
<keras.callbacks.History at 0x7f7519187410>
Saving the Model
vmodel.save(f'/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/vgg16_model.h5')
Plotting accuracy and validation accuracy
vval_acc=np.asarray(vmodel.history.history['val_accuracy'])*100
vacc=np.asarray(vmodel.history.history['accuracy'])*100
vacc=pd.DataFrame({'val_acc':vval_acc,'acc':vacc})
vacc.plot(figsize=(20,10),yticks=range(50,100,5))
<matplotlib.axes._subplots.AxesSubplot at 0x7f7518da9690>
Plotting Loss and Validation Loss
vloss=vmodel.history.history['loss']
vval_loss=vmodel.history.history['val_loss']
vloss=pd.DataFrame({'val_loss':vval_loss,'loss':vloss})
vloss.plot(figsize=(20,10))
<matplotlib.axes._subplots.AxesSubplot at 0x7f75190cf3d0>
Model Testing
Testing with saved Model
#load the model we saved; as above, the recompile only matters if training resumes
vgg16_loaded_model = load_model('/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/vgg16_model.h5')
vgg16_loaded_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
y=[]
test.reset()
# Collect the labels of the first 4 batches (4 x 32 = 128 samples)
for i in tqdm(range(4)):
    _,target=test.__getitem__(i)
    for j in target:
        y.append(j)
100%|██████████| 4/4 [00:02<00:00, 1.68it/s]
test.reset()
y_predv=vgg16_loaded_model.predict(test)
predv=[]
# binarise each sigmoid output at the 0.5 threshold
for i in y_predv:
    if i[0]>=0.5:
        predv.append(1)
    else:
        predv.append(0)
Classification Report and ROC Curve
from sklearn.metrics import roc_curve,auc,precision_recall_curve,classification_report
print(classification_report(y,predv[:len(y)]))
precision recall f1-score support
0.0 0.93 0.83 0.88 101
1.0 0.55 0.78 0.65 27
accuracy 0.82 128
macro avg 0.74 0.80 0.76 128
weighted avg 0.85 0.82 0.83 128
plt.figure(figsize=(20,10))
fprv,tprv,_=roc_curve(y,y_predv[:len(y)])
area_under_curvev=auc(fprv,tprv)
print('The area under the curve is:',area_under_curvev)
# Plot area under curve
plt.plot(fprv,tprv,'b.-')
plt.xlabel('False positive rate')
plt.ylabel('True positive rate')
plt.plot(fprv,fprv,linestyle='--',color='black')
The area under the curve is: 0.8780711404473781
[<matplotlib.lines.Line2D at 0x7f7422866310>]
from tensorflow.keras.applications.resnet import ResNet50
import keras
resnet_model = ResNet50(input_shape = (256,256,3), include_top = False, weights = 'imagenet')
output = resnet_model.layers[-1].output
output = keras.layers.Flatten()(output)
resnet_model = Model(resnet_model.input, output)
for layer in resnet_model.layers:
layer.trainable = False
resnet_model.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/resnet/resnet50_weights_tf_dim_ordering_tf_kernels_notop.h5
94773248/94765736 [==============================] - 1s 0us/step
94781440/94765736 [==============================] - 1s 0us/step
Model: "model"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 256, 256, 3 0 []
)]
conv1_pad (ZeroPadding2D) (None, 262, 262, 3) 0 ['input_1[0][0]']
conv1_conv (Conv2D) (None, 128, 128, 64 9472 ['conv1_pad[0][0]']
)
conv1_bn (BatchNormalization) (None, 128, 128, 64 256 ['conv1_conv[0][0]']
)
conv1_relu (Activation) (None, 128, 128, 64 0 ['conv1_bn[0][0]']
)
pool1_pad (ZeroPadding2D) (None, 130, 130, 64 0 ['conv1_relu[0][0]']
)
pool1_pool (MaxPooling2D) (None, 64, 64, 64) 0 ['pool1_pad[0][0]']
conv2_block1_1_conv (Conv2D) (None, 64, 64, 64) 4160 ['pool1_pool[0][0]']
conv2_block1_1_bn (BatchNormal (None, 64, 64, 64) 256 ['conv2_block1_1_conv[0][0]']
ization)
conv2_block1_1_relu (Activatio (None, 64, 64, 64) 0 ['conv2_block1_1_bn[0][0]']
n)
conv2_block1_2_conv (Conv2D) (None, 64, 64, 64) 36928 ['conv2_block1_1_relu[0][0]']
conv2_block1_2_bn (BatchNormal (None, 64, 64, 64) 256 ['conv2_block1_2_conv[0][0]']
ization)
conv2_block1_2_relu (Activatio (None, 64, 64, 64) 0 ['conv2_block1_2_bn[0][0]']
n)
conv2_block1_0_conv (Conv2D) (None, 64, 64, 256) 16640 ['pool1_pool[0][0]']
conv2_block1_3_conv (Conv2D) (None, 64, 64, 256) 16640 ['conv2_block1_2_relu[0][0]']
conv2_block1_0_bn (BatchNormal (None, 64, 64, 256) 1024 ['conv2_block1_0_conv[0][0]']
ization)
conv2_block1_3_bn (BatchNormal (None, 64, 64, 256) 1024 ['conv2_block1_3_conv[0][0]']
ization)
conv2_block1_add (Add) (None, 64, 64, 256) 0 ['conv2_block1_0_bn[0][0]',
'conv2_block1_3_bn[0][0]']
conv2_block1_out (Activation) (None, 64, 64, 256) 0 ['conv2_block1_add[0][0]']
conv2_block2_1_conv (Conv2D) (None, 64, 64, 64) 16448 ['conv2_block1_out[0][0]']
conv2_block2_1_bn (BatchNormal (None, 64, 64, 64) 256 ['conv2_block2_1_conv[0][0]']
ization)
conv2_block2_1_relu (Activatio (None, 64, 64, 64) 0 ['conv2_block2_1_bn[0][0]']
n)
conv2_block2_2_conv (Conv2D) (None, 64, 64, 64) 36928 ['conv2_block2_1_relu[0][0]']
conv2_block2_2_bn (BatchNormal (None, 64, 64, 64) 256 ['conv2_block2_2_conv[0][0]']
ization)
conv2_block2_2_relu (Activatio (None, 64, 64, 64) 0 ['conv2_block2_2_bn[0][0]']
n)
conv2_block2_3_conv (Conv2D) (None, 64, 64, 256) 16640 ['conv2_block2_2_relu[0][0]']
conv2_block2_3_bn (BatchNormal (None, 64, 64, 256) 1024 ['conv2_block2_3_conv[0][0]']
ization)
conv2_block2_add (Add) (None, 64, 64, 256) 0 ['conv2_block1_out[0][0]',
'conv2_block2_3_bn[0][0]']
conv2_block2_out (Activation) (None, 64, 64, 256) 0 ['conv2_block2_add[0][0]']
conv2_block3_1_conv (Conv2D) (None, 64, 64, 64) 16448 ['conv2_block2_out[0][0]']
conv2_block3_1_bn (BatchNormal (None, 64, 64, 64) 256 ['conv2_block3_1_conv[0][0]']
ization)
conv2_block3_1_relu (Activatio (None, 64, 64, 64) 0 ['conv2_block3_1_bn[0][0]']
n)
conv2_block3_2_conv (Conv2D) (None, 64, 64, 64) 36928 ['conv2_block3_1_relu[0][0]']
conv2_block3_2_bn (BatchNormal (None, 64, 64, 64) 256 ['conv2_block3_2_conv[0][0]']
ization)
conv2_block3_2_relu (Activatio (None, 64, 64, 64) 0 ['conv2_block3_2_bn[0][0]']
n)
conv2_block3_3_conv (Conv2D) (None, 64, 64, 256) 16640 ['conv2_block3_2_relu[0][0]']
conv2_block3_3_bn (BatchNormal (None, 64, 64, 256) 1024 ['conv2_block3_3_conv[0][0]']
ization)
conv2_block3_add (Add) (None, 64, 64, 256) 0 ['conv2_block2_out[0][0]',
'conv2_block3_3_bn[0][0]']
conv2_block3_out (Activation) (None, 64, 64, 256) 0 ['conv2_block3_add[0][0]']
conv3_block1_1_conv (Conv2D) (None, 32, 32, 128) 32896 ['conv2_block3_out[0][0]']
conv3_block1_1_bn (BatchNormal (None, 32, 32, 128) 512 ['conv3_block1_1_conv[0][0]']
ization)
conv3_block1_1_relu (Activatio (None, 32, 32, 128) 0 ['conv3_block1_1_bn[0][0]']
n)
conv3_block1_2_conv (Conv2D) (None, 32, 32, 128) 147584 ['conv3_block1_1_relu[0][0]']
conv3_block1_2_bn (BatchNormal (None, 32, 32, 128) 512 ['conv3_block1_2_conv[0][0]']
ization)
conv3_block1_2_relu (Activatio (None, 32, 32, 128) 0 ['conv3_block1_2_bn[0][0]']
n)
conv3_block1_0_conv (Conv2D) (None, 32, 32, 512) 131584 ['conv2_block3_out[0][0]']
conv3_block1_3_conv (Conv2D) (None, 32, 32, 512) 66048 ['conv3_block1_2_relu[0][0]']
conv3_block1_0_bn (BatchNormal (None, 32, 32, 512) 2048 ['conv3_block1_0_conv[0][0]']
ization)
conv3_block1_3_bn (BatchNormal (None, 32, 32, 512) 2048 ['conv3_block1_3_conv[0][0]']
ization)
conv3_block1_add (Add) (None, 32, 32, 512) 0 ['conv3_block1_0_bn[0][0]',
'conv3_block1_3_bn[0][0]']
conv3_block1_out (Activation) (None, 32, 32, 512) 0 ['conv3_block1_add[0][0]']
conv3_block2_1_conv (Conv2D) (None, 32, 32, 128) 65664 ['conv3_block1_out[0][0]']
conv3_block2_1_bn (BatchNormal (None, 32, 32, 128) 512 ['conv3_block2_1_conv[0][0]']
ization)
conv3_block2_1_relu (Activatio (None, 32, 32, 128) 0 ['conv3_block2_1_bn[0][0]']
n)
conv3_block2_2_conv (Conv2D) (None, 32, 32, 128) 147584 ['conv3_block2_1_relu[0][0]']
conv3_block2_2_bn (BatchNormal (None, 32, 32, 128) 512 ['conv3_block2_2_conv[0][0]']
ization)
conv3_block2_2_relu (Activatio (None, 32, 32, 128) 0 ['conv3_block2_2_bn[0][0]']
n)
conv3_block2_3_conv (Conv2D) (None, 32, 32, 512) 66048 ['conv3_block2_2_relu[0][0]']
conv3_block2_3_bn (BatchNormal (None, 32, 32, 512) 2048 ['conv3_block2_3_conv[0][0]']
ization)
conv3_block2_add (Add) (None, 32, 32, 512) 0 ['conv3_block1_out[0][0]',
'conv3_block2_3_bn[0][0]']
conv3_block2_out (Activation) (None, 32, 32, 512) 0 ['conv3_block2_add[0][0]']
conv3_block3_1_conv (Conv2D) (None, 32, 32, 128) 65664 ['conv3_block2_out[0][0]']
conv3_block3_1_bn (BatchNormal (None, 32, 32, 128) 512 ['conv3_block3_1_conv[0][0]']
ization)
conv3_block3_1_relu (Activatio (None, 32, 32, 128) 0 ['conv3_block3_1_bn[0][0]']
n)
conv3_block3_2_conv (Conv2D) (None, 32, 32, 128) 147584 ['conv3_block3_1_relu[0][0]']
conv3_block3_2_bn (BatchNormal (None, 32, 32, 128) 512 ['conv3_block3_2_conv[0][0]']
ization)
conv3_block3_2_relu (Activatio (None, 32, 32, 128) 0 ['conv3_block3_2_bn[0][0]']
n)
conv3_block3_3_conv (Conv2D) (None, 32, 32, 512) 66048 ['conv3_block3_2_relu[0][0]']
conv3_block3_3_bn (BatchNormal (None, 32, 32, 512) 2048 ['conv3_block3_3_conv[0][0]']
ization)
conv3_block3_add (Add) (None, 32, 32, 512) 0 ['conv3_block2_out[0][0]',
'conv3_block3_3_bn[0][0]']
conv3_block3_out (Activation) (None, 32, 32, 512) 0 ['conv3_block3_add[0][0]']
conv3_block4_1_conv (Conv2D) (None, 32, 32, 128) 65664 ['conv3_block3_out[0][0]']
conv3_block4_1_bn (BatchNormal (None, 32, 32, 128) 512 ['conv3_block4_1_conv[0][0]']
ization)
conv3_block4_1_relu (Activatio (None, 32, 32, 128) 0 ['conv3_block4_1_bn[0][0]']
n)
conv3_block4_2_conv (Conv2D) (None, 32, 32, 128) 147584 ['conv3_block4_1_relu[0][0]']
conv3_block4_2_bn (BatchNormal (None, 32, 32, 128) 512 ['conv3_block4_2_conv[0][0]']
ization)
conv3_block4_2_relu (Activatio (None, 32, 32, 128) 0 ['conv3_block4_2_bn[0][0]']
n)
conv3_block4_3_conv (Conv2D) (None, 32, 32, 512) 66048 ['conv3_block4_2_relu[0][0]']
conv3_block4_3_bn (BatchNormal (None, 32, 32, 512) 2048 ['conv3_block4_3_conv[0][0]']
ization)
conv3_block4_add (Add) (None, 32, 32, 512) 0 ['conv3_block3_out[0][0]',
'conv3_block4_3_bn[0][0]']
conv3_block4_out (Activation) (None, 32, 32, 512) 0 ['conv3_block4_add[0][0]']
conv4_block1_1_conv (Conv2D) (None, 16, 16, 256) 131328 ['conv3_block4_out[0][0]']
conv4_block1_1_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block1_1_conv[0][0]']
ization)
conv4_block1_1_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block1_1_bn[0][0]']
n)
conv4_block1_2_conv (Conv2D) (None, 16, 16, 256) 590080 ['conv4_block1_1_relu[0][0]']
conv4_block1_2_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block1_2_conv[0][0]']
ization)
conv4_block1_2_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block1_2_bn[0][0]']
n)
conv4_block1_0_conv (Conv2D) (None, 16, 16, 1024 525312 ['conv3_block4_out[0][0]']
)
conv4_block1_3_conv (Conv2D) (None, 16, 16, 1024 263168 ['conv4_block1_2_relu[0][0]']
)
conv4_block1_0_bn (BatchNormal (None, 16, 16, 1024 4096 ['conv4_block1_0_conv[0][0]']
ization) )
conv4_block1_3_bn (BatchNormal (None, 16, 16, 1024 4096 ['conv4_block1_3_conv[0][0]']
ization) )
conv4_block1_add (Add) (None, 16, 16, 1024 0 ['conv4_block1_0_bn[0][0]',
) 'conv4_block1_3_bn[0][0]']
conv4_block1_out (Activation) (None, 16, 16, 1024 0 ['conv4_block1_add[0][0]']
)
conv4_block2_1_conv (Conv2D) (None, 16, 16, 256) 262400 ['conv4_block1_out[0][0]']
conv4_block2_1_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block2_1_conv[0][0]']
ization)
conv4_block2_1_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block2_1_bn[0][0]']
n)
conv4_block2_2_conv (Conv2D) (None, 16, 16, 256) 590080 ['conv4_block2_1_relu[0][0]']
conv4_block2_2_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block2_2_conv[0][0]']
ization)
conv4_block2_2_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block2_2_bn[0][0]']
n)
conv4_block2_3_conv (Conv2D) (None, 16, 16, 1024 263168 ['conv4_block2_2_relu[0][0]']
)
conv4_block2_3_bn (BatchNormal (None, 16, 16, 1024 4096 ['conv4_block2_3_conv[0][0]']
ization) )
conv4_block2_add (Add) (None, 16, 16, 1024 0 ['conv4_block1_out[0][0]',
) 'conv4_block2_3_bn[0][0]']
conv4_block2_out (Activation) (None, 16, 16, 1024 0 ['conv4_block2_add[0][0]']
)
conv4_block3_1_conv (Conv2D) (None, 16, 16, 256) 262400 ['conv4_block2_out[0][0]']
conv4_block3_1_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block3_1_conv[0][0]']
ization)
conv4_block3_1_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block3_1_bn[0][0]']
n)
conv4_block3_2_conv (Conv2D) (None, 16, 16, 256) 590080 ['conv4_block3_1_relu[0][0]']
conv4_block3_2_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block3_2_conv[0][0]']
ization)
conv4_block3_2_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block3_2_bn[0][0]']
n)
conv4_block3_3_conv (Conv2D) (None, 16, 16, 1024 263168 ['conv4_block3_2_relu[0][0]']
)
conv4_block3_3_bn (BatchNormal (None, 16, 16, 1024 4096 ['conv4_block3_3_conv[0][0]']
ization) )
conv4_block3_add (Add) (None, 16, 16, 1024 0 ['conv4_block2_out[0][0]',
) 'conv4_block3_3_bn[0][0]']
conv4_block3_out (Activation) (None, 16, 16, 1024 0 ['conv4_block3_add[0][0]']
)
conv4_block4_1_conv (Conv2D) (None, 16, 16, 256) 262400 ['conv4_block3_out[0][0]']
conv4_block4_1_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block4_1_conv[0][0]']
ization)
conv4_block4_1_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block4_1_bn[0][0]']
n)
conv4_block4_2_conv (Conv2D) (None, 16, 16, 256) 590080 ['conv4_block4_1_relu[0][0]']
conv4_block4_2_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block4_2_conv[0][0]']
ization)
conv4_block4_2_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block4_2_bn[0][0]']
n)
conv4_block4_3_conv (Conv2D) (None, 16, 16, 1024 263168 ['conv4_block4_2_relu[0][0]']
)
conv4_block4_3_bn (BatchNormal (None, 16, 16, 1024 4096 ['conv4_block4_3_conv[0][0]']
ization) )
conv4_block4_add (Add) (None, 16, 16, 1024 0 ['conv4_block3_out[0][0]',
) 'conv4_block4_3_bn[0][0]']
conv4_block4_out (Activation) (None, 16, 16, 1024 0 ['conv4_block4_add[0][0]']
)
conv4_block5_1_conv (Conv2D) (None, 16, 16, 256) 262400 ['conv4_block4_out[0][0]']
conv4_block5_1_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block5_1_conv[0][0]']
ization)
conv4_block5_1_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block5_1_bn[0][0]']
n)
conv4_block5_2_conv (Conv2D) (None, 16, 16, 256) 590080 ['conv4_block5_1_relu[0][0]']
conv4_block5_2_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block5_2_conv[0][0]']
ization)
conv4_block5_2_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block5_2_bn[0][0]']
n)
conv4_block5_3_conv (Conv2D) (None, 16, 16, 1024 263168 ['conv4_block5_2_relu[0][0]']
)
conv4_block5_3_bn (BatchNormal (None, 16, 16, 1024 4096 ['conv4_block5_3_conv[0][0]']
ization) )
conv4_block5_add (Add) (None, 16, 16, 1024 0 ['conv4_block4_out[0][0]',
) 'conv4_block5_3_bn[0][0]']
conv4_block5_out (Activation) (None, 16, 16, 1024 0 ['conv4_block5_add[0][0]']
)
conv4_block6_1_conv (Conv2D) (None, 16, 16, 256) 262400 ['conv4_block5_out[0][0]']
conv4_block6_1_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block6_1_conv[0][0]']
ization)
conv4_block6_1_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block6_1_bn[0][0]']
n)
conv4_block6_2_conv (Conv2D) (None, 16, 16, 256) 590080 ['conv4_block6_1_relu[0][0]']
conv4_block6_2_bn (BatchNormal (None, 16, 16, 256) 1024 ['conv4_block6_2_conv[0][0]']
ization)
conv4_block6_2_relu (Activatio (None, 16, 16, 256) 0 ['conv4_block6_2_bn[0][0]']
n)
conv4_block6_3_conv (Conv2D) (None, 16, 16, 1024 263168 ['conv4_block6_2_relu[0][0]']
)
conv4_block6_3_bn (BatchNormal (None, 16, 16, 1024 4096 ['conv4_block6_3_conv[0][0]']
ization) )
conv4_block6_add (Add) (None, 16, 16, 1024 0 ['conv4_block5_out[0][0]',
) 'conv4_block6_3_bn[0][0]']
conv4_block6_out (Activation) (None, 16, 16, 1024 0 ['conv4_block6_add[0][0]']
)
conv5_block1_1_conv (Conv2D) (None, 8, 8, 512) 524800 ['conv4_block6_out[0][0]']
conv5_block1_1_bn (BatchNormal (None, 8, 8, 512) 2048 ['conv5_block1_1_conv[0][0]']
ization)
conv5_block1_1_relu (Activatio (None, 8, 8, 512) 0 ['conv5_block1_1_bn[0][0]']
n)
conv5_block1_2_conv (Conv2D) (None, 8, 8, 512) 2359808 ['conv5_block1_1_relu[0][0]']
conv5_block1_2_bn (BatchNormal (None, 8, 8, 512) 2048 ['conv5_block1_2_conv[0][0]']
ization)
conv5_block1_2_relu (Activatio (None, 8, 8, 512) 0 ['conv5_block1_2_bn[0][0]']
n)
conv5_block1_0_conv (Conv2D) (None, 8, 8, 2048) 2099200 ['conv4_block6_out[0][0]']
conv5_block1_3_conv (Conv2D) (None, 8, 8, 2048) 1050624 ['conv5_block1_2_relu[0][0]']
conv5_block1_0_bn (BatchNormal (None, 8, 8, 2048) 8192 ['conv5_block1_0_conv[0][0]']
ization)
conv5_block1_3_bn (BatchNormal (None, 8, 8, 2048) 8192 ['conv5_block1_3_conv[0][0]']
ization)
conv5_block1_add (Add) (None, 8, 8, 2048) 0 ['conv5_block1_0_bn[0][0]',
'conv5_block1_3_bn[0][0]']
conv5_block1_out (Activation) (None, 8, 8, 2048) 0 ['conv5_block1_add[0][0]']
conv5_block2_1_conv (Conv2D) (None, 8, 8, 512) 1049088 ['conv5_block1_out[0][0]']
conv5_block2_1_bn (BatchNormal (None, 8, 8, 512) 2048 ['conv5_block2_1_conv[0][0]']
ization)
conv5_block2_1_relu (Activatio (None, 8, 8, 512) 0 ['conv5_block2_1_bn[0][0]']
n)
conv5_block2_2_conv (Conv2D) (None, 8, 8, 512) 2359808 ['conv5_block2_1_relu[0][0]']
conv5_block2_2_bn (BatchNormal (None, 8, 8, 512) 2048 ['conv5_block2_2_conv[0][0]']
ization)
conv5_block2_2_relu (Activatio (None, 8, 8, 512) 0 ['conv5_block2_2_bn[0][0]']
n)
conv5_block2_3_conv (Conv2D) (None, 8, 8, 2048) 1050624 ['conv5_block2_2_relu[0][0]']
conv5_block2_3_bn (BatchNormal (None, 8, 8, 2048) 8192 ['conv5_block2_3_conv[0][0]']
ization)
conv5_block2_add (Add) (None, 8, 8, 2048) 0 ['conv5_block1_out[0][0]',
'conv5_block2_3_bn[0][0]']
conv5_block2_out (Activation) (None, 8, 8, 2048) 0 ['conv5_block2_add[0][0]']
conv5_block3_1_conv (Conv2D) (None, 8, 8, 512) 1049088 ['conv5_block2_out[0][0]']
conv5_block3_1_bn (BatchNormal (None, 8, 8, 512) 2048 ['conv5_block3_1_conv[0][0]']
ization)
conv5_block3_1_relu (Activatio (None, 8, 8, 512) 0 ['conv5_block3_1_bn[0][0]']
n)
conv5_block3_2_conv (Conv2D) (None, 8, 8, 512) 2359808 ['conv5_block3_1_relu[0][0]']
conv5_block3_2_bn (BatchNormal (None, 8, 8, 512) 2048 ['conv5_block3_2_conv[0][0]']
ization)
conv5_block3_2_relu (Activatio (None, 8, 8, 512) 0 ['conv5_block3_2_bn[0][0]']
n)
conv5_block3_3_conv (Conv2D) (None, 8, 8, 2048) 1050624 ['conv5_block3_2_relu[0][0]']
conv5_block3_3_bn (BatchNormal (None, 8, 8, 2048) 8192 ['conv5_block3_3_conv[0][0]']
ization)
conv5_block3_add (Add) (None, 8, 8, 2048) 0 ['conv5_block2_out[0][0]',
'conv5_block3_3_bn[0][0]']
conv5_block3_out (Activation) (None, 8, 8, 2048) 0 ['conv5_block3_add[0][0]']
flatten (Flatten) (None, 131072) 0 ['conv5_block3_out[0][0]']
==================================================================================================
Total params: 23,587,712
Trainable params: 0
Non-trainable params: 23,587,712
__________________________________________________________________________________________________
Model Compilation
from keras.layers import Dense, Dropout
from keras.models import Sequential

# Stack a small fully connected classifier head on top of the frozen
# ResNet50 feature extractor built above.
rmodel = Sequential()
rmodel.add(resnet_model)
rmodel.add(Dense(512, activation='relu'))
rmodel.add(Dropout(0.3))
rmodel.add(Dense(512, activation='relu'))
rmodel.add(Dropout(0.3))
rmodel.add(Dense(1, activation='sigmoid'))  # binary output: pneumonia vs. normal
rmodel.compile(optimizer='adam',
               loss='binary_crossentropy',
               metrics=['accuracy'])
rmodel.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
model (Functional) (None, 131072) 23587712
dense (Dense) (None, 512) 67109376
dropout (Dropout) (None, 512) 0
dense_1 (Dense) (None, 512) 262656
dropout_1 (Dropout) (None, 512) 0
dense_2 (Dense) (None, 1) 513
=================================================================
Total params: 90,960,257
Trainable params: 67,372,545
Non-trainable params: 23,587,712
_________________________________________________________________
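As a sanity check on the summary above, all 67,372,545 trainable parameters come from the three Dense layers of the head; the frozen ResNet50 backbone contributes the 23,587,712 non-trainable parameters. A minimal arithmetic sketch:

```python
# Each Dense layer holds n_in * n_out weights plus n_out biases.
def dense_params(n_in, n_out):
    return n_in * n_out + n_out

head = [
    dense_params(131072, 512),  # Dense(512) on the flattened ResNet features
    dense_params(512, 512),     # second Dense(512)
    dense_params(512, 1),       # sigmoid output unit
]
print(head)       # [67109376, 262656, 513] -- matches the dense layer rows above
print(sum(head))  # 67372545 -- matches "Trainable params"
```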
Model Training
rmodel.fit(train,epochs=20,callbacks=[reduce_lr],steps_per_epoch=100,validation_data=test,class_weight=class_weight)
Epoch 1/20
100/100 [==============================] - 2576s 26s/step - loss: 3.2309 - accuracy: 0.6114 - val_loss: 0.6875 - val_accuracy: 0.6810 - lr: 0.0010
Epoch 2/20
100/100 [==============================] - 1001s 10s/step - loss: 0.9994 - accuracy: 0.6999 - val_loss: 0.6449 - val_accuracy: 0.7365 - lr: 0.0010
Epoch 3/20
100/100 [==============================] - 656s 7s/step - loss: 0.9664 - accuracy: 0.7345 - val_loss: 0.6163 - val_accuracy: 0.7373 - lr: 0.0010
Epoch 4/20
100/100 [==============================] - 505s 5s/step - loss: 0.9651 - accuracy: 0.7441 - val_loss: 0.6189 - val_accuracy: 0.7279 - lr: 0.0010
Epoch 5/20
100/100 [==============================] - 455s 5s/step - loss: 0.9781 - accuracy: 0.7442 - val_loss: 0.5853 - val_accuracy: 0.7657 - lr: 0.0010
Epoch 6/20
100/100 [==============================] - 477s 5s/step - loss: 1.0410 - accuracy: 0.7598 - val_loss: 0.6596 - val_accuracy: 0.7744 - lr: 0.0010
Epoch 7/20
100/100 [==============================] - 491s 5s/step - loss: 1.0343 - accuracy: 0.7505 - val_loss: 0.6243 - val_accuracy: 0.8216 - lr: 0.0010
Epoch 8/20
Epoch 8: ReduceLROnPlateau reducing learning rate to 0.0005000000237487257.
100/100 [==============================] - 500s 5s/step - loss: 0.9717 - accuracy: 0.7553 - val_loss: 0.6012 - val_accuracy: 0.7740 - lr: 0.0010
Epoch 9/20
100/100 [==============================] - 495s 5s/step - loss: 0.9296 - accuracy: 0.7448 - val_loss: 0.5823 - val_accuracy: 0.7384 - lr: 5.0000e-04
Epoch 10/20
Epoch 10: ReduceLROnPlateau reducing learning rate to 0.0002500000118743628.
100/100 [==============================] - 484s 5s/step - loss: 0.9344 - accuracy: 0.7349 - val_loss: 0.5744 - val_accuracy: 0.7395 - lr: 5.0000e-04
Epoch 11/20
100/100 [==============================] - 488s 5s/step - loss: 0.9089 - accuracy: 0.7446 - val_loss: 0.5318 - val_accuracy: 0.7991 - lr: 2.5000e-04
Epoch 12/20
Epoch 12: ReduceLROnPlateau reducing learning rate to 0.0001250000059371814.
100/100 [==============================] - 481s 5s/step - loss: 0.9009 - accuracy: 0.7487 - val_loss: 0.5237 - val_accuracy: 0.7984 - lr: 2.5000e-04
Epoch 13/20
100/100 [==============================] - 486s 5s/step - loss: 0.8929 - accuracy: 0.7501 - val_loss: 0.5282 - val_accuracy: 0.7875 - lr: 1.2500e-04
Epoch 14/20
Epoch 14: ReduceLROnPlateau reducing learning rate to 6.25000029685907e-05.
100/100 [==============================] - 466s 5s/step - loss: 0.9004 - accuracy: 0.7302 - val_loss: 0.5406 - val_accuracy: 0.7627 - lr: 1.2500e-04
Epoch 15/20
100/100 [==============================] - 472s 5s/step - loss: 0.8936 - accuracy: 0.7411 - val_loss: 0.5157 - val_accuracy: 0.8051 - lr: 6.2500e-05
Epoch 16/20
Epoch 16: ReduceLROnPlateau reducing learning rate to 3.125000148429535e-05.
100/100 [==============================] - 486s 5s/step - loss: 0.8904 - accuracy: 0.7364 - val_loss: 0.5152 - val_accuracy: 0.8032 - lr: 6.2500e-05
Epoch 17/20
100/100 [==============================] - 439s 4s/step - loss: 0.8805 - accuracy: 0.7441 - val_loss: 0.5404 - val_accuracy: 0.7567 - lr: 3.1250e-05
Epoch 18/20
Epoch 18: ReduceLROnPlateau reducing learning rate to 1.5625000742147677e-05.
100/100 [==============================] - 439s 4s/step - loss: 0.8785 - accuracy: 0.7367 - val_loss: 0.5257 - val_accuracy: 0.7822 - lr: 3.1250e-05
Epoch 19/20
100/100 [==============================] - 468s 5s/step - loss: 0.8772 - accuracy: 0.7422 - val_loss: 0.5150 - val_accuracy: 0.7916 - lr: 1.5625e-05
Epoch 20/20
Epoch 20: ReduceLROnPlateau reducing learning rate to 7.812500371073838e-06.
100/100 [==============================] - 469s 5s/step - loss: 0.8738 - accuracy: 0.7416 - val_loss: 0.5265 - val_accuracy: 0.7755 - lr: 1.5625e-05
<keras.callbacks.History at 0x7f2045333850>
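The `reduce_lr` callback halves the learning rate each time the monitored metric plateaus, which is exactly the sequence visible in the log (1e-3 → 5e-4 → 2.5e-4 → …). A minimal sketch of that schedule, assuming the callback was configured with `factor=0.5`:

```python
# Successive halvings of the initial Adam learning rate, as seen in the log
lr = 1e-3
schedule = [lr]
for _ in range(6):
    lr *= 0.5          # each ReduceLROnPlateau trigger halves lr
    schedule.append(lr)
print(schedule)
# [0.001, 0.0005, 0.00025, 0.000125, 6.25e-05, 3.125e-05, 1.5625e-05]
```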
Saving the Model
rmodel.save(f'/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/resnet50_model.h5')
Plotting Training & Validation Accuracy
# Plot training vs. validation accuracy (as percentages) per epoch
rval_acc=np.asarray(rmodel.history.history['val_accuracy'])*100
racc=np.asarray(rmodel.history.history['accuracy'])*100
racc=pd.DataFrame({'val_acc':rval_acc,'acc':racc})
racc.plot(figsize=(20,10),yticks=range(50,100,5))
<matplotlib.axes._subplots.AxesSubplot at 0x7f1f50931bd0>
Plotting Training & Validation Loss
# Plot training vs. validation loss per epoch
rloss=rmodel.history.history['loss']
rval_loss=rmodel.history.history['val_loss']
rloss=pd.DataFrame({'val_loss':rval_loss,'loss':rloss})
rloss.plot(figsize=(20,10))
<matplotlib.axes._subplots.AxesSubplot at 0x7f1f5096cc10>
Model Testing
Testing with saved Model
# Load the model we saved earlier
resnet50_loaded_model = load_model('/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/resnet50_model.h5')
resnet50_loaded_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
# Collect the ground-truth labels from the first 4 batches of the test generator
y = []
test.reset()
for i in tqdm(range(4)):
    _, target = test.__getitem__(i)
    for j in target:
        y.append(j)
100%|██████████| 4/4 [00:02<00:00, 1.96it/s]
test.reset()
y_predr = resnet50_loaded_model.predict(test)

# Threshold the sigmoid outputs at 0.5 to get hard class labels
predr = []
for i in y_predr:
    if i[0] >= 0.5:
        predr.append(1)
    else:
        predr.append(0)
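The thresholding loop above can also be written as one vectorized NumPy expression, which is equivalent and faster on large prediction arrays (shown here on a small made-up array of sigmoid outputs):

```python
import numpy as np

y_pred = np.array([[0.81], [0.12], [0.50], [0.49]])  # example sigmoid outputs
pred = (y_pred[:, 0] >= 0.5).astype(int)             # 1 if >= 0.5 else 0
print(pred.tolist())  # [1, 0, 1, 0]
```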
Classification Report and ROC Curve
print(classification_report(y,predr[:len(y)]))
              precision    recall  f1-score   support

         0.0       0.88      0.73      0.80        94
         1.0       0.50      0.74      0.60        34

    accuracy                           0.73       128
   macro avg       0.69      0.73      0.70       128
weighted avg       0.78      0.73      0.75       128
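The f1-score column of the report is the harmonic mean of precision and recall, which can be verified directly from the printed rows:

```python
# f1 = harmonic mean of precision and recall
def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

print(round(f1(0.50, 0.74), 2))  # 0.6 -- matches the class 1.0 row
print(round(f1(0.88, 0.73), 2))  # 0.8 -- matches the class 0.0 row
```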
plt.figure(figsize=(20,10))
fprr,tprr,_=roc_curve(y,y_predr[:len(y)])
area_under_curver=auc(fprr,tprr)
print('The area under the curve is:',area_under_curver)
# Plot area under curve
plt.plot(fprr,tprr,'b.-')
plt.xlabel('False positive rate')
plt.ylabel('True positive rate')
plt.plot(fprr,fprr,linestyle='--',color='black')
The area under the curve is: 0.8097622027534417
[<matplotlib.lines.Line2D at 0x7f1f50197990>]
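`auc(fprr, tprr)` integrates the ROC curve with the trapezoidal rule; the ~0.81 result means a randomly chosen pneumonia case is ranked above a randomly chosen normal case about 81% of the time. A minimal reimplementation of the integration step, for illustration only:

```python
def auc_trapezoid(fpr, tpr):
    """Area under a piecewise-linear ROC curve via the trapezoidal rule."""
    area = 0.0
    for i in range(1, len(fpr)):
        # width of the segment times the average height of its endpoints
        area += (fpr[i] - fpr[i - 1]) * (tpr[i] + tpr[i - 1]) / 2
    return area

print(auc_trapezoid([0.0, 0.0, 1.0], [0.0, 1.0, 1.0]))  # 1.0 (perfect classifier)
print(auc_trapezoid([0.0, 1.0], [0.0, 1.0]))            # 0.5 (chance diagonal)
```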
from tensorflow.keras.applications.inception_v3 import InceptionV3

inception_model = InceptionV3(input_shape=(256, 256, 3),
                              include_top=False,
                              weights='imagenet')
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/inception_v3/inception_v3_weights_tf_dim_ordering_tf_kernels_notop.h5
87916544/87910968 [==============================] - 0s 0us/step
87924736/87910968 [==============================] - 0s 0us/step
import keras
from keras.models import Model

# Replace the InceptionV3 top with a Flatten layer, then freeze every
# backbone layer so only a new classifier head will be trained.
output = inception_model.layers[-1].output
output = keras.layers.Flatten()(output)
inception_model = Model(inception_model.input, output)
for layer in inception_model.layers:
    layer.trainable = False
inception_model.summary()
inception_model.summary()
Model: "model_1"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_2 (InputLayer) [(None, 256, 256, 3 0 []
)]
conv2d (Conv2D) (None, 127, 127, 32 864 ['input_2[0][0]']
)
batch_normalization (BatchNorm (None, 127, 127, 32 96 ['conv2d[0][0]']
alization) )
activation (Activation) (None, 127, 127, 32 0 ['batch_normalization[0][0]']
)
conv2d_1 (Conv2D) (None, 125, 125, 32 9216 ['activation[0][0]']
)
batch_normalization_1 (BatchNo (None, 125, 125, 32 96 ['conv2d_1[0][0]']
rmalization) )
activation_1 (Activation) (None, 125, 125, 32 0 ['batch_normalization_1[0][0]']
)
conv2d_2 (Conv2D) (None, 125, 125, 64 18432 ['activation_1[0][0]']
)
batch_normalization_2 (BatchNo (None, 125, 125, 64 192 ['conv2d_2[0][0]']
rmalization) )
activation_2 (Activation) (None, 125, 125, 64 0 ['batch_normalization_2[0][0]']
)
max_pooling2d (MaxPooling2D) (None, 62, 62, 64) 0 ['activation_2[0][0]']
conv2d_3 (Conv2D) (None, 62, 62, 80) 5120 ['max_pooling2d[0][0]']
batch_normalization_3 (BatchNo (None, 62, 62, 80) 240 ['conv2d_3[0][0]']
rmalization)
activation_3 (Activation) (None, 62, 62, 80) 0 ['batch_normalization_3[0][0]']
conv2d_4 (Conv2D) (None, 60, 60, 192) 138240 ['activation_3[0][0]']
batch_normalization_4 (BatchNo (None, 60, 60, 192) 576 ['conv2d_4[0][0]']
rmalization)
activation_4 (Activation) (None, 60, 60, 192) 0 ['batch_normalization_4[0][0]']
max_pooling2d_1 (MaxPooling2D) (None, 29, 29, 192) 0 ['activation_4[0][0]']
conv2d_8 (Conv2D) (None, 29, 29, 64) 12288 ['max_pooling2d_1[0][0]']
batch_normalization_8 (BatchNo (None, 29, 29, 64) 192 ['conv2d_8[0][0]']
rmalization)
activation_8 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_8[0][0]']
conv2d_6 (Conv2D) (None, 29, 29, 48) 9216 ['max_pooling2d_1[0][0]']
conv2d_9 (Conv2D) (None, 29, 29, 96) 55296 ['activation_8[0][0]']
batch_normalization_6 (BatchNo (None, 29, 29, 48) 144 ['conv2d_6[0][0]']
rmalization)
batch_normalization_9 (BatchNo (None, 29, 29, 96) 288 ['conv2d_9[0][0]']
rmalization)
activation_6 (Activation) (None, 29, 29, 48) 0 ['batch_normalization_6[0][0]']
activation_9 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_9[0][0]']
average_pooling2d (AveragePool (None, 29, 29, 192) 0 ['max_pooling2d_1[0][0]']
ing2D)
conv2d_5 (Conv2D) (None, 29, 29, 64) 12288 ['max_pooling2d_1[0][0]']
conv2d_7 (Conv2D) (None, 29, 29, 64) 76800 ['activation_6[0][0]']
conv2d_10 (Conv2D) (None, 29, 29, 96) 82944 ['activation_9[0][0]']
conv2d_11 (Conv2D) (None, 29, 29, 32) 6144 ['average_pooling2d[0][0]']
batch_normalization_5 (BatchNo (None, 29, 29, 64) 192 ['conv2d_5[0][0]']
rmalization)
batch_normalization_7 (BatchNo (None, 29, 29, 64) 192 ['conv2d_7[0][0]']
rmalization)
batch_normalization_10 (BatchN (None, 29, 29, 96) 288 ['conv2d_10[0][0]']
ormalization)
batch_normalization_11 (BatchN (None, 29, 29, 32) 96 ['conv2d_11[0][0]']
ormalization)
activation_5 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_5[0][0]']
activation_7 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_7[0][0]']
activation_10 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_10[0][0]']
activation_11 (Activation) (None, 29, 29, 32) 0 ['batch_normalization_11[0][0]']
mixed0 (Concatenate) (None, 29, 29, 256) 0 ['activation_5[0][0]',
'activation_7[0][0]',
'activation_10[0][0]',
'activation_11[0][0]']
conv2d_15 (Conv2D) (None, 29, 29, 64) 16384 ['mixed0[0][0]']
batch_normalization_15 (BatchN (None, 29, 29, 64) 192 ['conv2d_15[0][0]']
ormalization)
activation_15 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_15[0][0]']
conv2d_13 (Conv2D) (None, 29, 29, 48) 12288 ['mixed0[0][0]']
conv2d_16 (Conv2D) (None, 29, 29, 96) 55296 ['activation_15[0][0]']
batch_normalization_13 (BatchN (None, 29, 29, 48) 144 ['conv2d_13[0][0]']
ormalization)
batch_normalization_16 (BatchN (None, 29, 29, 96) 288 ['conv2d_16[0][0]']
ormalization)
activation_13 (Activation) (None, 29, 29, 48) 0 ['batch_normalization_13[0][0]']
activation_16 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_16[0][0]']
average_pooling2d_1 (AveragePo (None, 29, 29, 256) 0 ['mixed0[0][0]']
oling2D)
conv2d_12 (Conv2D) (None, 29, 29, 64) 16384 ['mixed0[0][0]']
conv2d_14 (Conv2D) (None, 29, 29, 64) 76800 ['activation_13[0][0]']
conv2d_17 (Conv2D) (None, 29, 29, 96) 82944 ['activation_16[0][0]']
conv2d_18 (Conv2D) (None, 29, 29, 64) 16384 ['average_pooling2d_1[0][0]']
batch_normalization_12 (BatchN (None, 29, 29, 64) 192 ['conv2d_12[0][0]']
ormalization)
batch_normalization_14 (BatchN (None, 29, 29, 64) 192 ['conv2d_14[0][0]']
ormalization)
batch_normalization_17 (BatchN (None, 29, 29, 96) 288 ['conv2d_17[0][0]']
ormalization)
batch_normalization_18 (BatchN (None, 29, 29, 64) 192 ['conv2d_18[0][0]']
ormalization)
activation_12 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_12[0][0]']
activation_14 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_14[0][0]']
activation_17 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_17[0][0]']
activation_18 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_18[0][0]']
mixed1 (Concatenate) (None, 29, 29, 288) 0 ['activation_12[0][0]',
'activation_14[0][0]',
'activation_17[0][0]',
'activation_18[0][0]']
conv2d_22 (Conv2D) (None, 29, 29, 64) 18432 ['mixed1[0][0]']
batch_normalization_22 (BatchN (None, 29, 29, 64) 192 ['conv2d_22[0][0]']
ormalization)
activation_22 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_22[0][0]']
conv2d_20 (Conv2D) (None, 29, 29, 48) 13824 ['mixed1[0][0]']
conv2d_23 (Conv2D) (None, 29, 29, 96) 55296 ['activation_22[0][0]']
batch_normalization_20 (BatchN (None, 29, 29, 48) 144 ['conv2d_20[0][0]']
ormalization)
batch_normalization_23 (BatchN (None, 29, 29, 96) 288 ['conv2d_23[0][0]']
ormalization)
activation_20 (Activation) (None, 29, 29, 48) 0 ['batch_normalization_20[0][0]']
activation_23 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_23[0][0]']
average_pooling2d_2 (AveragePo (None, 29, 29, 288) 0 ['mixed1[0][0]']
oling2D)
conv2d_19 (Conv2D) (None, 29, 29, 64) 18432 ['mixed1[0][0]']
conv2d_21 (Conv2D) (None, 29, 29, 64) 76800 ['activation_20[0][0]']
conv2d_24 (Conv2D) (None, 29, 29, 96) 82944 ['activation_23[0][0]']
conv2d_25 (Conv2D) (None, 29, 29, 64) 18432 ['average_pooling2d_2[0][0]']
batch_normalization_19 (BatchN (None, 29, 29, 64) 192 ['conv2d_19[0][0]']
ormalization)
batch_normalization_21 (BatchN (None, 29, 29, 64) 192 ['conv2d_21[0][0]']
ormalization)
batch_normalization_24 (BatchN (None, 29, 29, 96) 288 ['conv2d_24[0][0]']
ormalization)
batch_normalization_25 (BatchN (None, 29, 29, 64) 192 ['conv2d_25[0][0]']
ormalization)
activation_19 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_19[0][0]']
activation_21 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_21[0][0]']
activation_24 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_24[0][0]']
activation_25 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_25[0][0]']
mixed2 (Concatenate) (None, 29, 29, 288) 0 ['activation_19[0][0]',
'activation_21[0][0]',
'activation_24[0][0]',
'activation_25[0][0]']
conv2d_27 (Conv2D) (None, 29, 29, 64) 18432 ['mixed2[0][0]']
batch_normalization_27 (BatchN (None, 29, 29, 64) 192 ['conv2d_27[0][0]']
ormalization)
activation_27 (Activation) (None, 29, 29, 64) 0 ['batch_normalization_27[0][0]']
conv2d_28 (Conv2D) (None, 29, 29, 96) 55296 ['activation_27[0][0]']
batch_normalization_28 (BatchN (None, 29, 29, 96) 288 ['conv2d_28[0][0]']
ormalization)
activation_28 (Activation) (None, 29, 29, 96) 0 ['batch_normalization_28[0][0]']
conv2d_26 (Conv2D) (None, 14, 14, 384) 995328 ['mixed2[0][0]']
conv2d_29 (Conv2D) (None, 14, 14, 96) 82944 ['activation_28[0][0]']
batch_normalization_26 (BatchN (None, 14, 14, 384) 1152 ['conv2d_26[0][0]']
ormalization)
batch_normalization_29 (BatchN (None, 14, 14, 96) 288 ['conv2d_29[0][0]']
ormalization)
activation_26 (Activation) (None, 14, 14, 384) 0 ['batch_normalization_26[0][0]']
activation_29 (Activation) (None, 14, 14, 96) 0 ['batch_normalization_29[0][0]']
max_pooling2d_2 (MaxPooling2D) (None, 14, 14, 288) 0 ['mixed2[0][0]']
mixed3 (Concatenate) (None, 14, 14, 768) 0 ['activation_26[0][0]',
'activation_29[0][0]',
'max_pooling2d_2[0][0]']
conv2d_34 (Conv2D) (None, 14, 14, 128) 98304 ['mixed3[0][0]']
batch_normalization_34 (BatchN (None, 14, 14, 128) 384 ['conv2d_34[0][0]']
ormalization)
activation_34 (Activation) (None, 14, 14, 128) 0 ['batch_normalization_34[0][0]']
conv2d_35 (Conv2D) (None, 14, 14, 128) 114688 ['activation_34[0][0]']
batch_normalization_35 (BatchN (None, 14, 14, 128) 384 ['conv2d_35[0][0]']
ormalization)
activation_35 (Activation) (None, 14, 14, 128) 0 ['batch_normalization_35[0][0]']
conv2d_31 (Conv2D) (None, 14, 14, 128) 98304 ['mixed3[0][0]']
conv2d_36 (Conv2D) (None, 14, 14, 128) 114688 ['activation_35[0][0]']
batch_normalization_31 (BatchN (None, 14, 14, 128) 384 ['conv2d_31[0][0]']
ormalization)
batch_normalization_36 (BatchN (None, 14, 14, 128) 384 ['conv2d_36[0][0]']
ormalization)
activation_31 (Activation) (None, 14, 14, 128) 0 ['batch_normalization_31[0][0]']
activation_36 (Activation) (None, 14, 14, 128) 0 ['batch_normalization_36[0][0]']
conv2d_32 (Conv2D) (None, 14, 14, 128) 114688 ['activation_31[0][0]']
conv2d_37 (Conv2D) (None, 14, 14, 128) 114688 ['activation_36[0][0]']
batch_normalization_32 (BatchN (None, 14, 14, 128) 384 ['conv2d_32[0][0]']
ormalization)
batch_normalization_37 (BatchN (None, 14, 14, 128) 384 ['conv2d_37[0][0]']
ormalization)
activation_32 (Activation) (None, 14, 14, 128) 0 ['batch_normalization_32[0][0]']
activation_37 (Activation) (None, 14, 14, 128) 0 ['batch_normalization_37[0][0]']
average_pooling2d_3 (AveragePo (None, 14, 14, 768) 0 ['mixed3[0][0]']
oling2D)
conv2d_30 (Conv2D) (None, 14, 14, 192) 147456 ['mixed3[0][0]']
conv2d_33 (Conv2D) (None, 14, 14, 192) 172032 ['activation_32[0][0]']
conv2d_38 (Conv2D) (None, 14, 14, 192) 172032 ['activation_37[0][0]']
conv2d_39 (Conv2D) (None, 14, 14, 192) 147456 ['average_pooling2d_3[0][0]']
batch_normalization_30 (BatchN (None, 14, 14, 192) 576 ['conv2d_30[0][0]']
ormalization)
batch_normalization_33 (BatchN (None, 14, 14, 192) 576 ['conv2d_33[0][0]']
ormalization)
batch_normalization_38 (BatchN (None, 14, 14, 192) 576 ['conv2d_38[0][0]']
ormalization)
batch_normalization_39 (BatchN (None, 14, 14, 192) 576 ['conv2d_39[0][0]']
ormalization)
activation_30 (Activation) (None, 14, 14, 192) 0 ['batch_normalization_30[0][0]']
activation_33 (Activation) (None, 14, 14, 192) 0 ['batch_normalization_33[0][0]']
activation_38 (Activation) (None, 14, 14, 192) 0 ['batch_normalization_38[0][0]']
activation_39 (Activation) (None, 14, 14, 192) 0 ['batch_normalization_39[0][0]']
mixed4 (Concatenate) (None, 14, 14, 768) 0 ['activation_30[0][0]',
'activation_33[0][0]',
'activation_38[0][0]',
'activation_39[0][0]']
conv2d_44 (Conv2D) (None, 14, 14, 160) 122880 ['mixed4[0][0]']
... [InceptionV3 summary continues through the mixed5-mixed10 inception blocks; each block
concatenates parallel Conv2D -> BatchNormalization -> Activation branches at 14x14x768
(mixed5-mixed7) and then 6x6x1280/2048 (mixed8-mixed10). The full per-layer listing,
garbled by line-wrapping in the export, is truncated here for brevity.] ...
flatten_1 (Flatten)            (None, 73728)        0           ['mixed10[0][0]']
==================================================================================================
Total params: 21,802,784
Trainable params: 0
Non-trainable params: 21,802,784
__________________________________________________________________________________________________
Model Compilation
from keras.models import Sequential
from keras.layers import Dense, Dropout

# Stack a small classification head on top of the frozen InceptionV3 base.
# The base model's Flatten output already fixes the input shape, so no
# input_dim argument is needed on the first Dense layer.
incept_model = Sequential()
incept_model.add(inception_model)
incept_model.add(Dense(512, activation='relu'))
incept_model.add(Dropout(0.3))
incept_model.add(Dense(512, activation='relu'))
incept_model.add(Dropout(0.3))
incept_model.add(Dense(1, activation='sigmoid'))
incept_model.compile(optimizer='adam',
                     loss='binary_crossentropy',
                     metrics=['accuracy'])
incept_model.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
model_1 (Functional) (None, 73728) 21802784
dense_3 (Dense) (None, 512) 37749248
dropout_2 (Dropout) (None, 512) 0
dense_4 (Dense) (None, 512) 262656
dropout_3 (Dropout) (None, 512) 0
dense_5 (Dense) (None, 1) 513
=================================================================
Total params: 59,815,201
Trainable params: 38,012,417
Non-trainable params: 21,802,784
_________________________________________________________________
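The head is compiled with a single sigmoid output and `binary_crossentropy`. As a quick sanity check on what that loss rewards, the per-sample value can be computed by hand (a minimal NumPy sketch, not part of the notebook):

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    """Per-sample binary cross-entropy for a sigmoid output in [0, 1]."""
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return float(-(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred)))

# A confident correct prediction incurs a small loss,
# a confident wrong prediction a large one.
print(binary_crossentropy(1.0, 0.9))  # ~0.105
print(binary_crossentropy(1.0, 0.1))  # ~2.303
```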
Model Training
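The `fit` call below passes a `class_weight` dict to compensate for the imbalance between normal and pneumonia images. Its construction is not shown in this excerpt; a common "balanced" recipe looks like the following sketch (the label counts here are hypothetical):

```python
from collections import Counter

# Hypothetical label counts; the real dict is built from the training split.
labels = [0] * 98 + [1] * 30
counts = Counter(labels)
total = len(labels)

# "Balanced" weighting: each class contributes equally to the total loss,
# so the minority class gets the larger per-sample weight.
class_weight = {c: total / (len(counts) * n) for c, n in counts.items()}
print(class_weight)
```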
incept_model.fit(train,epochs=20,callbacks=[reduce_lr],steps_per_epoch=100,validation_data=test,class_weight=class_weight)
Epoch 1/20
100/100 [==============================] - 433s 4s/step - loss: 3.9842 - accuracy: 0.6158 - val_loss: 0.4879 - val_accuracy: 0.7391 - lr: 0.0010
Epoch 2/20
100/100 [==============================] - 363s 4s/step - loss: 0.8952 - accuracy: 0.6480 - val_loss: 0.5133 - val_accuracy: 0.7241 - lr: 0.0010
Epoch 3/20
100/100 [==============================] - 371s 4s/step - loss: 0.8695 - accuracy: 0.6482 - val_loss: 0.4649 - val_accuracy: 0.7605 - lr: 0.0010
Epoch 4/20
100/100 [==============================] - 375s 4s/step - loss: 0.8772 - accuracy: 0.6358 - val_loss: 0.4593 - val_accuracy: 0.7691 - lr: 0.0010
Epoch 5/20
Epoch 5: ReduceLROnPlateau reducing learning rate to 0.0005000000237487257.
100/100 [==============================] - 374s 4s/step - loss: 0.8864 - accuracy: 0.6071 - val_loss: 0.4569 - val_accuracy: 0.7620 - lr: 0.0010
Epoch 6/20
100/100 [==============================] - 390s 4s/step - loss: 0.8630 - accuracy: 0.5952 - val_loss: 0.4468 - val_accuracy: 0.7601 - lr: 5.0000e-04
Epoch 7/20
Epoch 7: ReduceLROnPlateau reducing learning rate to 0.0002500000118743628.
100/100 [==============================] - 378s 4s/step - loss: 0.8504 - accuracy: 0.6206 - val_loss: 0.4546 - val_accuracy: 0.7440 - lr: 5.0000e-04
Epoch 8/20
100/100 [==============================] - 344s 3s/step - loss: 0.8442 - accuracy: 0.6459 - val_loss: 0.4458 - val_accuracy: 0.7755 - lr: 2.5000e-04
Epoch 9/20
Epoch 9: ReduceLROnPlateau reducing learning rate to 0.0001250000059371814.
100/100 [==============================] - 342s 3s/step - loss: 0.8190 - accuracy: 0.6397 - val_loss: 0.4458 - val_accuracy: 0.7699 - lr: 2.5000e-04
Epoch 10/20
100/100 [==============================] - 324s 3s/step - loss: 0.8002 - accuracy: 0.6380 - val_loss: 0.4254 - val_accuracy: 0.7740 - lr: 1.2500e-04
Epoch 11/20
100/100 [==============================] - 337s 3s/step - loss: 0.8006 - accuracy: 0.6523 - val_loss: 0.4724 - val_accuracy: 0.7241 - lr: 1.2500e-04
Epoch 12/20
100/100 [==============================] - 339s 3s/step - loss: 0.8036 - accuracy: 0.6495 - val_loss: 0.4379 - val_accuracy: 0.7732 - lr: 1.2500e-04
Epoch 13/20
100/100 [==============================] - 349s 3s/step - loss: 0.8072 - accuracy: 0.6546 - val_loss: 0.4570 - val_accuracy: 0.7605 - lr: 1.2500e-04
Epoch 14/20
100/100 [==============================] - 329s 3s/step - loss: 0.7985 - accuracy: 0.6623 - val_loss: 0.4406 - val_accuracy: 0.7699 - lr: 1.2500e-04
Epoch 15/20
100/100 [==============================] - 332s 3s/step - loss: 0.8059 - accuracy: 0.6477 - val_loss: 0.4777 - val_accuracy: 0.7414 - lr: 1.2500e-04
Epoch 16/20
Epoch 16: ReduceLROnPlateau reducing learning rate to 6.25000029685907e-05.
100/100 [==============================] - 331s 3s/step - loss: 0.7857 - accuracy: 0.6612 - val_loss: 0.4131 - val_accuracy: 0.8006 - lr: 1.2500e-04
Epoch 17/20
100/100 [==============================] - 322s 3s/step - loss: 0.7859 - accuracy: 0.6610 - val_loss: 0.4531 - val_accuracy: 0.7642 - lr: 6.2500e-05
Epoch 18/20
Epoch 18: ReduceLROnPlateau reducing learning rate to 3.125000148429535e-05.
100/100 [==============================] - 341s 3s/step - loss: 0.7927 - accuracy: 0.6579 - val_loss: 0.4489 - val_accuracy: 0.7687 - lr: 6.2500e-05
Epoch 19/20
100/100 [==============================] - 350s 4s/step - loss: 0.7859 - accuracy: 0.6632 - val_loss: 0.4501 - val_accuracy: 0.7687 - lr: 3.1250e-05
Epoch 20/20
100/100 [==============================] - 358s 4s/step - loss: 0.7799 - accuracy: 0.6662 - val_loss: 0.4458 - val_accuracy: 0.7627 - lr: 3.1250e-05
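The log shows the learning rate being halved at epochs 5, 7, 9, 16 and 18 (1e-3 → 5e-4 → 2.5e-4 → …), so the `ReduceLROnPlateau` callback was evidently configured with `factor=0.5` rather than the Keras default of 0.1. The resulting schedule is a simple geometric decay:

```python
def lr_after_reductions(initial_lr, num_reductions, factor=0.5):
    """Learning rate after `num_reductions` ReduceLROnPlateau steps."""
    return initial_lr * factor ** num_reductions

# Reproduce the values seen in the training log above.
for n in range(6):
    print(n, lr_after_reductions(1e-3, n))
```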
Saving the Model
incept_model.save('/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/inceptionnet_model.h5')
Plotting Training & Validation Accuracy
# Plot accuracy curves as percentages. Note: DataFrame.plot() creates its own
# figure, so a preceding plt.figure() call would only leave an empty figure behind.
incval_acc = np.asarray(incept_model.history.history['val_accuracy']) * 100
incacc = np.asarray(incept_model.history.history['accuracy']) * 100
incacc = pd.DataFrame({'val_acc': incval_acc, 'acc': incacc})
incacc.plot(figsize=(20, 10), yticks=range(50, 100, 5))
Plotting Training & Validation Loss
inc_loss = incept_model.history.history['loss']
inc_val_loss = incept_model.history.history['val_loss']
# Use a separate name for the DataFrame so the raw loss list is not overwritten.
inc_loss_df = pd.DataFrame({'val_loss': inc_val_loss, 'loss': inc_loss})
inc_loss_df.plot(figsize=(20, 10))
Model Testing
Testing with the Saved Model
# Load the model we saved. Compiling again is only needed for evaluate();
# the optimizer chosen here does not affect predictions.
inceptionnet_loaded_model = load_model('/content/drive/MyDrive/Colab Notebooks/Capstone Project/pneumonia_status/inceptionnet_model.h5')
inceptionnet_loaded_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])

# Collect the ground-truth labels from the first 4 test batches
y = []
test.reset()
for i in tqdm(range(4)):
    _, target = test.__getitem__(i)
    for j in target:
        y.append(j)
100%|██████████| 4/4 [00:02<00:00, 1.92it/s]
test.reset()
y_predi = inceptionnet_loaded_model.predict(test)

# Threshold the sigmoid outputs at 0.5 to obtain hard class labels
predi = []
for i in y_predi:
    if i[0] >= 0.5:
        predi.append(1)
    else:
        predi.append(0)
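Since `predict()` returns an array of shape `(n_samples, 1)`, the thresholding loop can also be written as a single vectorized NumPy expression (a small illustrative sketch with made-up probabilities):

```python
import numpy as np

# Sigmoid outputs as returned by predict(), shape (n_samples, 1)
y_probs = np.array([[0.2], [0.7], [0.5], [0.49]])

# Boolean mask -> int labels, replacing the explicit loop
pred = (y_probs[:, 0] >= 0.5).astype(int)
print(pred.tolist())  # [0, 1, 1, 0]
```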
Classification report & ROC Curve
print(classification_report(y,predi[:len(y)]))
              precision    recall  f1-score   support

         0.0       0.94      0.78      0.85        98
         1.0       0.53      0.83      0.65        30

    accuracy                           0.79       128
   macro avg       0.74      0.80      0.75       128
weighted avg       0.84      0.79      0.80       128
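The report's numbers follow directly from the confusion-matrix counts. The counts below are inferred from the report (class 1 = pneumonia: 25 of 30 positives caught, at the cost of 22 false alarms), not printed by the notebook itself:

```python
# Confusion counts consistent with the report above (inferred)
tp, fp, fn, tn = 25, 22, 5, 76

precision = tp / (tp + fp)                                 # 25/47  ~ 0.53
recall    = tp / (tp + fn)                                 # 25/30  ~ 0.83
f1        = 2 * precision * recall / (precision + recall)  # ~ 0.65
accuracy  = (tp + tn) / (tp + fp + fn + tn)                # 101/128 ~ 0.79
print(precision, recall, f1, accuracy)
```

High recall with modest precision is the usual trade-off favored in screening tasks: missing a pneumonia case is costlier than a false alarm.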
plt.figure(figsize=(20,10))
fpri,tpri,_=roc_curve(y,y_predi[:len(y)])
area_under_curvei=auc(fpri,tpri)
print('The area under the curve is:',area_under_curvei)
# Plot area under curve
plt.plot(fpri,tpri,'b.-')
plt.xlabel('False positive rate')
plt.ylabel('True positive rate')
plt.plot(fpri,fpri,linestyle='--',color='black')
The area under the curve is: 0.8702380952380953
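The AUC value printed above is simply the trapezoidal area under the ROC curve, which `auc(fpri, tpri)` computes internally. On a toy 3-point curve (made-up values, for illustration only):

```python
# Tiny illustrative ROC curve: (fpr, tpr) points from (0,0) to (1,1)
fpr = [0.0, 0.2, 1.0]
tpr = [0.0, 0.8, 1.0]

# Trapezoidal rule: sum of trapezoid areas between consecutive points
auc_value = sum((fpr[i + 1] - fpr[i]) * (tpr[i + 1] + tpr[i]) / 2
                for i in range(len(fpr) - 1))
print(auc_value)  # 0.8
```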
Locate the position of inflammation in an image
import math
import os
import shutil
import sys
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
import glob
!pip install pydicom
import pydicom
from sklearn.model_selection import train_test_split
from tqdm import tqdm
import warnings
warnings.simplefilter(action="ignore", category=FutureWarning)
np.random.seed(5)
Collecting pydicom
Downloading pydicom-2.2.2-py3-none-any.whl (2.0 MB)
|████████████████████████████████| 2.0 MB 5.2 MB/s
Installing collected packages: pydicom
Successfully installed pydicom-2.2.2
#OpenCV and dependencies
!sudo apt-get install libopencv-dev python3-opencv
Reading package lists... Done
Building dependency tree
Reading state information... Done
libopencv-dev is already the newest version (3.2.0+dfsg-4ubuntu0.1).
The following NEW packages will be installed: python3-numpy python3-opencv
Fetched 2,477 kB in 0s (5,426 kB/s)
Setting up python3-numpy (1:1.13.3-2ubuntu1) ...
Setting up python3-opencv (3.2.0+dfsg-4ubuntu0.1) ...
Processing triggers for man-db (2.8.3-2ubuntu0.1) ...
(debconf frontend warnings omitted)
1. Clone and Build YOLOv3
!git clone https://github.com/pjreddie/darknet
Cloning into 'darknet'... remote: Enumerating objects: 5946, done. remote: Total 5946 (delta 0), reused 0 (delta 0), pack-reused 5946 Receiving objects: 100% (5946/5946), 6.37 MiB | 24.98 MiB/s, done. Resolving deltas: 100% (3928/3928), done.
#changing working directory to Darknet
%cd darknet
/content/darknet
!make
mkdir -p obj
mkdir -p backup
mkdir -p results
gcc -Iinclude/ -Isrc/ -Wall -Wno-unused-result -Wno-unknown-pragmas -Wfatal-errors -fPIC -Ofast -c ./src/gemm.c -o obj/gemm.o
... (compilation of the remaining src/ and examples/ sources follows the same pattern and is truncated here) ...
gcc -Wall -Wno-unused-result -Wno-unknown-pragmas -Wfatal-errors -fPIC -Ofast -shared obj/gemm.o ... obj/image_opencv.o -o libdarknet.so -lm -pthread
ar rcs libdarknet.a obj/gemm.o ... obj/image_opencv.o
gcc -Iinclude/ -Isrc/ -Wall -Wno-unused-result -Wno-unknown-pragmas -Wfatal-errors -fPIC -Ofast obj/captcha.o ... obj/darknet.o libdarknet.a -o darknet -lm -pthread
2. Data Migration for YOLOv3
2.1. Make subdirectories
#Creating working directories
Project_path = "/content/drive/MyDrive/Colab Notebooks/Capstone Project/"
train_images = os.path.join(Project_path, "stage_2_train_images")
test_images = os.path.join(Project_path, "stage_2_test_images")
images_folder = os.path.join(os.getcwd(), "images") # .jpg
labels_folder = os.path.join(os.getcwd(), "labels") # .txt
metadata_folder = os.path.join(os.getcwd(), "metadata") # .txt
# YOLOv3 config file directory
config_folder = os.path.join(os.getcwd(), "cfg")
# YOLOv3 training checkpoints will be saved here
backup_folder = os.path.join(os.getcwd(), "backup")
for directory in [images_folder, labels_folder, metadata_folder, config_folder, backup_folder]:
    if os.path.isdir(directory):
        continue
    os.mkdir(directory)
!ls -shtl
total 2.3M
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:57 labels
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:57 metadata
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:57 images
716K -rwxr-xr-x 1 root root 714K Mar 24 09:57 darknet
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:57 obj
816K -rw-r--r-- 1 root root 814K Mar 24 09:57 libdarknet.a
612K -rwxr-xr-x 1 root root 609K Mar 24 09:57 libdarknet.so
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:56 results
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:56 backup
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:56 src
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:56 scripts
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:56 examples
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:56 include
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:56 python
4.0K drwxr-xr-x 3 root root 4.0K Mar 24 09:56 data
4.0K drwxr-xr-x 2 root root 4.0K Mar 24 09:56 cfg
36K -rw-r--r-- 1 root root 35K Mar 24 09:56 LICENSE.gpl
4.0K -rw-r--r-- 1 root root 360 Mar 24 09:56 LICENSE.meta
4.0K -rw-r--r-- 1 root root 1.1K Mar 24 09:56 LICENSE.mit
4.0K -rw-r--r-- 1 root root 461 Mar 24 09:56 LICENSE.v1
4.0K -rw-r--r-- 1 root root 3.0K Mar 24 09:56 Makefile
4.0K -rw-r--r-- 1 root root 2.7K Mar 24 09:56 README.md
4.0K -rw-r--r-- 1 root root 515 Mar 24 09:56 LICENSE
4.0K -rw-r--r-- 1 root root 474 Mar 24 09:56 LICENSE.fuck
8.0K -rw-r--r-- 1 root root 6.5K Mar 24 09:56 LICENSE.gen
Project_path
'/content/drive/MyDrive/Colab Notebooks/Capstone Project/'
2.2. Load stage_2_train_labels.csv
image_annotation = pd.read_csv(os.path.join(Project_path, "stage_2_train_labels.csv"))
image_annotation.head()
| | patientId | x | y | width | height | Target |
|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | NaN | NaN | NaN | NaN | 0 |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | NaN | NaN | NaN | NaN | 0 |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | NaN | NaN | NaN | NaN | 0 |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | NaN | NaN | NaN | NaN | 0 |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.0 | 152.0 | 213.0 | 379.0 | 1 |
2.3. Generate images and labels for training YOLOv3
In the YOLO label format, w and h are the bounding-box width and height divided by the image width and height, and x and y are the box centre coordinates, likewise normalized.
This differs from the format of the label data provided with the dataset (top-left corner plus width and height, in pixels), so we must convert it.
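As a worked check (not part of the notebook itself), converting the sample bbox from the annotation table above (x=264, y=152, width=213, height=379 on a 1024-pixel image) yields the normalized centre-format values:

```python
# Worked example of the pixel-to-YOLO conversion described above, using the
# bbox from the sample annotation row (x=264, y=152, width=213, height=379).
IMG_SIZE = 1024  # RSNA CXRs are 1024x1024 pixels

def to_yolo(x, y, w, h, img_size=IMG_SIZE):
    """Top-left pixel bbox -> YOLO's normalized centre format."""
    rw, rh = w / img_size, h / img_size
    rcx = x / img_size + rw / 2
    rcy = y / img_size + rh / 2
    return rcx, rcy, rw, rh

print(to_yolo(264.0, 152.0, 213.0, 379.0))
# (0.36181640625, 0.33349609375, 0.2080078125, 0.3701171875)
```

These values match the first line of the generated label file shown in section 2.4.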
#Generate images and labels for training YOLOv3
def save_img_from_dcm(dcm_dir, images_folder, patient_id):
    img_fp = os.path.join(images_folder, "{}.jpg".format(patient_id))
    if os.path.exists(img_fp):
        return
    dcm_fp = os.path.join(dcm_dir, "{}.dcm".format(patient_id))
    # Stack the single-channel DICOM pixel array into 3 channels for JPEG
    img_1ch = pydicom.read_file(dcm_fp).pixel_array
    img_3ch = np.stack([img_1ch]*3, -1)
    cv2.imwrite(img_fp, img_3ch)
def save_label_from_dcm(label_dir, patient_id, row=None):
    # RSNA default image size
    img_size = 1024
    label_fp = os.path.join(label_dir, "{}.txt".format(patient_id))
    f = open(label_fp, "a")
    if row is None:
        f.close()
        return
    top_left_x = row[1]
    top_left_y = row[2]
    w = row[3]
    h = row[4]
    # 'r' means relative, 'c' means center.
    rx = top_left_x/img_size
    ry = top_left_y/img_size
    rw = w/img_size
    rh = h/img_size
    rcx = rx+rw/2
    rcy = ry+rh/2
    line = "{} {} {} {} {}\n".format(0, rcx, rcy, rw, rh)
    f.write(line)
    f.close()
def save_yolov3_data_from_rsna(dcm_dir, images_folder, labels_folder, image_annotation):
    for row in tqdm(image_annotation.values):
        patient_id = row[0]
        img_fp = os.path.join(images_folder, "{}.jpg".format(patient_id))
        if os.path.exists(img_fp):
            save_label_from_dcm(labels_folder, patient_id, row)
            continue
        target = row[5]
        # Only files with a bbox (Target == 1).
        if target == 0:
            continue
        save_label_from_dcm(labels_folder, patient_id, row)
        save_img_from_dcm(dcm_dir, images_folder, patient_id)
import cv2
save_yolov3_data_from_rsna(train_images, images_folder, labels_folder, image_annotation)
100%|██████████| 30227/30227 [19:11<00:00, 26.24it/s]
!du -sh images labels
990M images 24M labels
2.4. Plot a sample train image and label
sample_patient_id = image_annotation[image_annotation.Target == 1].patientId.values[0]
sample_img_path = os.path.join(images_folder, "{}.jpg".format(sample_patient_id))
sample_label_path = os.path.join(labels_folder, "{}.txt".format(sample_patient_id))
plt.imshow(cv2.imread(sample_img_path))
img_size = 1024  # RSNA CXRs are 1024x1024 pixels
with open(sample_label_path, "r") as f:
    for line in f:
        print(line)
        class_id, rcx, rcy, rw, rh = list(map(float, line.strip().split()))
        x = (rcx-rw/2)*img_size
        y = (rcy-rh/2)*img_size
        w = rw*img_size
        h = rh*img_size
        plt.plot([x, x, x+w, x+w, x], [y, y+h, y+h, y, y])
0 0.36181640625 0.33349609375 0.2080078125 0.3701171875
0 0.36181640625 0.33349609375 0.2080078125 0.3701171875
0 0.673828125 0.36962890625 0.25 0.4423828125
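As a small check (not part of the notebook), the decode inside the plotting loop can be inverted against the first label line: with the true 1024-pixel image size it recovers exactly the pixel bbox from the annotation table.

```python
# Decode the first label line back to pixel coordinates; with a 1024-pixel
# image size this recovers the annotation-table bbox for this patient
# (x=264, y=152, width=213, height=379).
IMG_SIZE = 1024

def from_yolo(rcx, rcy, rw, rh, img_size=IMG_SIZE):
    """YOLO normalized centre format -> top-left pixel bbox."""
    w, h = rw * img_size, rh * img_size
    x = (rcx - rw / 2) * img_size
    y = (rcy - rh / 2) * img_size
    return x, y, w, h

print(from_yolo(0.36181640625, 0.33349609375, 0.2080078125, 0.3701171875))
# (264.0, 152.0, 213.0, 379.0)
```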
2.5. Generate train and validation file path lists (.txt)
#Function for writing train and validation lists
def write_train_list(metadata_folder, images_folder, name, series):
    list_fp = os.path.join(metadata_folder, name)
    with open(list_fp, "w") as f:
        for patient_id in series:
            line = "{}\n".format(os.path.join(images_folder, "{}.jpg".format(patient_id)))
            f.write(line)
#Following lines contain data with bbox only
patient_id_series = image_annotation[image_annotation.Target == 1].patientId.drop_duplicates()
train_series, valid_series = train_test_split(patient_id_series, test_size=0.1, random_state=5)
print("The number of train set: {}, The number of validation set: {}".format(train_series.shape[0], valid_series.shape[0]))
#train image path list
write_train_list(metadata_folder, images_folder, "train_list.txt", train_series)
#validation image path list
write_train_list(metadata_folder, images_folder, "valid_list.txt", valid_series)
The number of train set: 5410, The number of validation set: 602
2.6. Create test images and labels for YOLOv3
#Create test images and labels for YOLOv3
def save_yolov3_test_data(test_images, images_folder, metadata_folder, name, series):
    list_fp = os.path.join(metadata_folder, name)
    with open(list_fp, "w") as f:
        for patient_id in series:
            save_img_from_dcm(test_images, images_folder, patient_id)
            line = "{}\n".format(os.path.join(images_folder, "{}.jpg".format(patient_id)))
            f.write(line)
test_dcm_fps = list(set(glob.glob(os.path.join(test_images, '*.dcm'))))
test_dcm_fps = pd.Series(test_dcm_fps).apply(lambda dcm_fp: dcm_fp.strip().split("/")[-1].replace(".dcm",""))
save_yolov3_test_data(test_images, images_folder, metadata_folder, "test_list.txt", test_dcm_fps)
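The patient-id extraction above (split on "/" and strip ".dcm") can also be written with pathlib, which is less fragile across platforms. A minimal illustration with a made-up path (the directory and file name are hypothetical):

```python
from pathlib import Path

# Toy path standing in for a real stage_2_test_images DICOM file.
fp = "/content/stage_2_test_images/some-patient-id.dcm"
# Path.stem drops both the directory and the .dcm extension in one step.
print(Path(fp).stem)  # some-patient-id
```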
2.7. Plot a sample test image
sample_patient_id = test_dcm_fps[0]
sample_img_path = os.path.join(images_folder, "{}.jpg".format(sample_patient_id))
plt.imshow(cv2.imread(sample_img_path))
<matplotlib.image.AxesImage at 0x7fba6c795890>
3. Prepare Configuration Files for Using YOLOv3
We prepare the configuration files and the pre-trained weights needed for training. This proceeds in four steps: write the rsna.data file, write the rsna.names label file, download a YOLOv3 cfg file edited for RSNA, and download the pre-trained darknet53.conv.74 weights. In rsna.data, train is the path to the training image list text file, valid the validation list, names the class-name file, and backup the checkpoint folder.
data_extention_file_path = os.path.join(config_folder, 'rsna.data')
with open(data_extention_file_path, 'w') as f:
    contents = """classes= 1
train = {}
valid = {}
names = {}
backup = {}
""".format(os.path.join(metadata_folder, "train_list.txt"),
           os.path.join(metadata_folder, "valid_list.txt"),
           os.path.join(config_folder, 'rsna.names'),
           backup_folder)
    f.write(contents)
!cat cfg/rsna.data
classes= 1
train = /content/darknet/metadata/train_list.txt
valid = /content/darknet/metadata/valid_list.txt
names = /content/darknet/cfg/rsna.names
backup = /content/darknet/backup
# Label list of bounding box.
!echo "pneumonia" > cfg/rsna.names
darknet53.conv.74 (Download Pre-trained Model)
#Download Pre-trained model
!wget -q https://pjreddie.com/media/files/darknet53.conv.74
#Download a cfg file edited for RSNA.
!wget --no-check-certificate -q "https://docs.google.com/uc?export=download&id=18ptTK4Vbeokqpux8Onr0OmwUP9ipmcYO" -O cfg/rsna_yolov3.cfg_train
4. Training YOLOv3
4.0. Training with Pre-trained CNN Weights (darknet53.conv.74)
!./darknet detector train cfg/rsna.data cfg/rsna_yolov3.cfg_train darknet53.conv.74 -i 0 | tee train_log.txt
4.1. Plots of Training Loss
!wget --no-check-certificate -q "https://docs.google.com/uc?export=download&id=1OhnlV3s7r6xsEme6DKkNYjcYjsl-C_Av" -O train_log.txt
iters = []
losses = []
total_losses = []
with open("train_log.txt", 'r') as f:
    for i, line in enumerate(f):
        if "images" in line:
            iters.append(int(line.strip().split()[0].split(":")[0]))
            losses.append(float(line.strip().split()[2]))
            total_losses.append(float(line.strip().split()[1].split(',')[0]))
plt.figure(figsize=(20, 5))
plt.subplot(1,2,1)
sns.lineplot(iters, total_losses, label="Total loss")
sns.lineplot(iters, losses, label="Avg loss")
plt.xlabel("Iteration")
plt.ylabel("Loss")
plt.subplot(1,2,2)
sns.lineplot(iters, total_losses, label="Total loss")
sns.lineplot(iters, losses, label="Avg loss")
plt.xlabel("Iteration")
plt.ylabel("Loss")
plt.ylim([0, 4.05])
(0.0, 4.05)
5. Trained YOLOv3 on Test Images
ex_patient_id = image_annotation[image_annotation.Target == 1].patientId.values[2]
shutil.copy(sample_img_path, "./images/011d6083-abaa-4b28-8a53-36aa0cddb3b5.jpg")
print(sample_patient_id)
033af663-f044-43ab-a569-298cc4bf37bc
5.1. Load trained model (at 15300 iterations)
!wget --load-cookies /tmp/cookies.txt -q "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=1FDzMN-kGVYCvBeDKwemAazldSVkAEFyd' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=1FDzMN-kGVYCvBeDKwemAazldSVkAEFyd" -O backup/rsna_yolov3_15300.weights && rm -rf /tmp/cookies.txt
!ls -alsth backup
total 235M 235M -rw-r--r-- 1 root root 235M Mar 24 09:53 rsna_yolov3_15300.weights 4.0K drwxr-xr-x 16 root root 4.0K Mar 24 09:53 .. 4.0K drwxr-xr-x 2 root root 4.0K Mar 24 08:58 .
5.2. cfg file for testing
!wget --no-check-certificate -q "https://docs.google.com/uc?export=download&id=10Yk6ZMAKGz5LeBbikciALy82aK3lX-57" -O cfg/rsna_yolov3.cfg_test
5.3. Detection Using the Trained Model
#Prediction on test image
!./darknet detector test ./cfg/rsna.data ./cfg/rsna_yolov3.cfg_test ./backup/rsna_yolov3_15300.weights ./images/206bdd43-542c-4888-a7e5-4e1932638b0d.jpg -thresh 0.005
layer filters size input output
0 conv 32 3 x 3 / 1 608 x 608 x 3 -> 608 x 608 x 32 0.639 BFLOPs
1 conv 64 3 x 3 / 2 608 x 608 x 32 -> 304 x 304 x 64 3.407 BFLOPs
2 conv 32 1 x 1 / 1 304 x 304 x 64 -> 304 x 304 x 32 0.379 BFLOPs
3 conv 64 3 x 3 / 1 304 x 304 x 32 -> 304 x 304 x 64 3.407 BFLOPs
4 res 1 304 x 304 x 64 -> 304 x 304 x 64
5 conv 128 3 x 3 / 2 304 x 304 x 64 -> 152 x 152 x 128 3.407 BFLOPs
6 conv 64 1 x 1 / 1 152 x 152 x 128 -> 152 x 152 x 64 0.379 BFLOPs
7 conv 128 3 x 3 / 1 152 x 152 x 64 -> 152 x 152 x 128 3.407 BFLOPs
8 res 5 152 x 152 x 128 -> 152 x 152 x 128
9 conv 64 1 x 1 / 1 152 x 152 x 128 -> 152 x 152 x 64 0.379 BFLOPs
10 conv 128 3 x 3 / 1 152 x 152 x 64 -> 152 x 152 x 128 3.407 BFLOPs
11 res 8 152 x 152 x 128 -> 152 x 152 x 128
12 conv 256 3 x 3 / 2 152 x 152 x 128 -> 76 x 76 x 256 3.407 BFLOPs
13 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs
14 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
15 res 12 76 x 76 x 256 -> 76 x 76 x 256
16 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs
17 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
18 res 15 76 x 76 x 256 -> 76 x 76 x 256
19 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs
20 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
21 res 18 76 x 76 x 256 -> 76 x 76 x 256
22 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs
23 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
24 res 21 76 x 76 x 256 -> 76 x 76 x 256
25 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs
26 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
27 res 24 76 x 76 x 256 -> 76 x 76 x 256
28 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs
29 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
30 res 27 76 x 76 x 256 -> 76 x 76 x 256
31 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs
32 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
33 res 30 76 x 76 x 256 -> 76 x 76 x 256
34 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs
35 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
36 res 33 76 x 76 x 256 -> 76 x 76 x 256
37 conv 512 3 x 3 / 2 76 x 76 x 256 -> 38 x 38 x 512 3.407 BFLOPs
38 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs
39 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
40 res 37 38 x 38 x 512 -> 38 x 38 x 512
41 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs
42 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
43 res 40 38 x 38 x 512 -> 38 x 38 x 512
44 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs
45 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
46 res 43 38 x 38 x 512 -> 38 x 38 x 512
47 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs
48 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
49 res 46 38 x 38 x 512 -> 38 x 38 x 512
50 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs
51 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
52 res 49 38 x 38 x 512 -> 38 x 38 x 512
53 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs
54 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
55 res 52 38 x 38 x 512 -> 38 x 38 x 512
56 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs
57 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
58 res 55 38 x 38 x 512 -> 38 x 38 x 512
59 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs
60 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
61 res 58 38 x 38 x 512 -> 38 x 38 x 512
62 conv 1024 3 x 3 / 2 38 x 38 x 512 -> 19 x 19 x1024 3.407 BFLOPs
63 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs
64 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs
65 res 62 19 x 19 x1024 -> 19 x 19 x1024
66 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs
67 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs
68 res 65 19 x 19 x1024 -> 19 x 19 x1024
69 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs
70 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs
71 res 68 19 x 19 x1024 -> 19 x 19 x1024
72 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs
73 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs
74 res 71 19 x 19 x1024 -> 19 x 19 x1024
75 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs
76 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs
77 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs
78 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs
79 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs
80 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs
81 conv 18 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 18 0.013 BFLOPs
82 yolo
83 route 79
84 conv 256 1 x 1 / 1 19 x 19 x 512 -> 19 x 19 x 256 0.095 BFLOPs
85 upsample 2x 19 x 19 x 256 -> 38 x 38 x 256
86 route 85 61
87 conv 256 1 x 1 / 1 38 x 38 x 768 -> 38 x 38 x 256 0.568 BFLOPs
88 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
89 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs
90 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
91 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs
92 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs
93 conv 18 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 18 0.027 BFLOPs
94 yolo
95 route 91
96 conv 128 1 x 1 / 1 38 x 38 x 256 -> 38 x 38 x 128 0.095 BFLOPs
97 upsample 2x 38 x 38 x 128 -> 76 x 76 x 128
98 route 97 36
99 conv 128 1 x 1 / 1 76 x 76 x 384 -> 76 x 76 x 128 0.568 BFLOPs
100 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
101 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs
102 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
103 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs
104 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs
105 conv 18 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 18 0.053 BFLOPs
106 yolo
Loading weights from ./backup/rsna_yolov3_15300.weights...Done!
./images/206bdd43-542c-4888-a7e5-4e1932638b0d.jpg: Predicted in 23.432988 seconds.
pneumonia: 33%
pneumonia: 14%
pneumonia: 1%
Prediction 1
#Plot the prediction with the position of Lung Inflammation
plt.imshow(cv2.imread("predictions.jpg"))
<matplotlib.image.AxesImage at 0x7fba6798d250>
Prediction 2
#Prediction on test image
!./darknet detector test ./cfg/rsna.data ./cfg/rsna_yolov3.cfg_test ./backup/rsna_yolov3_15300.weights ./images/395738ff-53d2-42e6-aa6d-a01e4a07f01f.jpg -thresh 0.25
(network architecture printout identical to the run above, omitted)
Loading weights from ./backup/rsna_yolov3_15300.weights...Done!
./images/395738ff-53d2-42e6-aa6d-a01e4a07f01f.jpg: Predicted in 22.544326 seconds.
pneumonia: 99%
pneumonia: 96%
#Plot the prediction with the position of Lung Inflammation
plt.imshow(cv2.imread("predictions.jpg"))
<matplotlib.image.AxesImage at 0x7fba67a881d0>
Results:
With YOLOv3, the model performs both tasks at once: classification and locating the position of inflammation in an image.
The model has several advantages over classifier-based systems. It looks at the whole image at test time, so its predictions are informed by global context in the image. It also makes predictions with a single network evaluation, unlike systems such as R-CNN, which require thousands of evaluations for a single image.
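To make the "single network evaluation" concrete: from the architecture printout above, the three YOLO heads operate on 19x19, 38x38, and 76x76 grids with 3 anchor boxes per cell (filters=18 = 3 x (4 box coordinates + objectness + 1 class)), so one forward pass scores:

```python
# Count the candidate boxes YOLOv3 evaluates in a single forward pass,
# derived from the network printout: three YOLO heads at 19x19, 38x38,
# and 76x76 cells, each predicting 3 anchor boxes per cell.
scales = [19, 38, 76]
anchors_per_cell = 3
total_boxes = sum(s * s * anchors_per_cell for s in scales)
print(total_boxes)  # 22743
```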
Limitations:
Training time was high even though Google Colab Pro (GPU/TPU) was used, and we had to reduce the number of training epochs to avoid continuous-usage timeouts before all models finished executing.
Accuracy could be improved by further hyperparameter tuning and by training the models for more epochs, computing resources permitting.
Subsampling the dataset may also help mitigate this constraint, although it could affect model accuracy.
Way Forward / Future Work:
Upsampling for class balance.
Stratified splitting in the train/test split.
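A minimal sketch (not from the notebook) of the two follow-ups mentioned above, using a toy DataFrame with the same patientId/Target schema as stage_2_train_labels.csv; scikit-learn's train_test_split and resample are assumed available, and the data below is illustrative only:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

# Toy frame with the same schema as stage_2_train_labels.csv (illustrative only).
df = pd.DataFrame({
    "patientId": ["p{}".format(i) for i in range(10)],
    "Target": [0] * 7 + [1] * 3,  # imbalanced labels
})

# Stratified split: keeps the 0/1 ratio in both partitions.
train, valid = train_test_split(df, test_size=0.2, stratify=df.Target, random_state=5)

# Upsample the minority class in the training partition only, so the
# validation set stays representative of the true class distribution.
majority = train[train.Target == 0]
minority = train[train.Target == 1]
minority_up = resample(minority, replace=True, n_samples=len(majority), random_state=5)
train_balanced = pd.concat([majority, minority_up])
print(train_balanced.Target.value_counts().to_dict())
```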
References:
[1] https://www.kaggle.com/c/rsna-pneumonia-detection-challenge/overview
[2] https://keras.io/api/applications/vgg/
[3] https://www.kaggle.com/keras/resnet50
[4] https://keras.io/api/applications/inceptionv3/
[5] J. Redmon and A. Farhadi, "YOLOv3: An Incremental Improvement", University of Washington, 2018.